
Connectivity 2016 : Matthew Lempriere

BEYOND SPEED: FUTURE‑PROOFING YOUR NETWORK STRATEGY.

By Matthew Lempriere, Global Head of Financial Services at Telstra

Speed has always attracted a premium in financial markets. In the 19th century that might have involved carrier pigeons and telegraphs. In more recent times it has meant fibre optics and microwave technology. Either way, the proposition is simple: the faster you are, the more likely you will be to have an edge over others competing in the same space.

But achieving the lowest possible latency is no longer the be-all and end-all priority for market participants that it once was. Financial services firms are competing on a number of different levels and are facing a much wider array of challenges than they have in the past, from regulatory requirements to disruptive business practices that have resulted from new technology.

All of this has made network strategy a far more complicated endeavour. The penalty for getting it wrong can be huge, with high costs, resource-draining projects and locked-in limitations due to the long-term nature of network contracts. But the rewards for getting it right are even bigger, with business agility, client loyalty and a reduced cost base all becoming more attainable.

A new set of priorities

It was not that long ago that the most tech-savvy of trading firms would pay just about anything to shave milliseconds or even microseconds off roundtrips between financial ecosystems. Network providers would blast through mountains and dig under waterways to make fibre-optic cable connections between financial centres as straight as possible. They would snap up the rights to use towers, or build their own, to create microwave networks that sent data even faster than cables.

They’re still doing all of that, and they can still command premiums from those firms whose business models absolutely depend on being among the very fastest to get their orders from one location to the next. But the speed game is no longer as lucrative as it once was and a number of factors such as technological change, cost pressures and an industry-wide focus on best execution are changing the landscape for what network users want and need. For many financial services firms, key considerations such as bandwidth, flexibility and even network maintenance are now rivalling latency as top priorities.

At a time when new applications are being developed at lightning speed, many banks have been struggling with the challenge of rolling out new digital services globally. These applications can take up large, or in some cases unknown, amounts of bandwidth. The last thing financial firms want to have to do is tinker with their networks every weekend. It means downtime and it can be resource-intensive, all of which can be costly. And that’s not even taking into account the risk of failure.

What this translates into is a need for high bandwidth, high performance and network scalability. As banks and other financial service providers expand their offerings, their network requirements change as well.

Connectivity concerns

One way to see this in action is to look at what’s been happening with Asian markets and new pools of liquidity. In the early days of the low latency revolution, most of the attention was focused on the main US and European centres, with Chicago-New York, London-Frankfurt and London-New York routes getting the bulk of the development. Put simply, where there was liquidity there was demand for more network speed. In recent years, as trading firms have sought to exploit new opportunities, Asia has come more into focus. Often that means long-haul routes where the millisecond roundtrip times can run into triple digits on fibre optic lines, say from Singapore or Australia to the United States. In this respect, network connectivity design can make an enormous difference.

Network connectivity is also taking on greater importance because of the spotlight on best execution. Latency is certainly important in terms of achieving the best available trades, but a certain degree of low latency performance is now considered a given in most markets. What can matter even more, however, is the ability to connect to different venues and financial ecosystems. Think of it this way: the wider and more efficient your network, the better the chance of having access to the best quotes and ultimately achieving best execution.

One way companies can quickly try out different liquidity centres is via software-defined networking (SDN). For instance, a company such as Telstra, which operates a global network that features connections to many hundreds of locations, can use SDN to allow firms to try out different exchanges. Using SDN, a network provider can control data flows across its network, giving priority to different data. That provides the performance guarantee that financial firms require with the added bonus of flexibility. For instance, a firm can try out a new venue for a short period of time without being locked into a multi-year network contract.
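To make the idea concrete, the short sketch below is a toy model, not a description of Telstra's SDN or its interfaces, of how a time-limited, prioritised flow rule could let a firm trial a new venue and then fall away automatically when the trial ends. All class names, venues and figures are invented for illustration.

```python
"""Illustrative sketch only: a toy model of time-limited, prioritised SDN-style
flow rules, loosely following the idea of trying out a new venue without a
long-term commitment. All names and numbers are hypothetical."""

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List


@dataclass
class FlowRule:
    """A simplified flow rule: traffic to `venue` gets `priority` until `expires`."""
    venue: str
    bandwidth_mbps: int
    priority: int          # higher number = served first
    expires: datetime


class TrialCapableNetwork:
    """Keeps a list of flow rules and drops them automatically once expired."""

    def __init__(self) -> None:
        self.rules: List[FlowRule] = []

    def add_trial_venue(self, venue: str, bandwidth_mbps: int,
                        priority: int, trial_days: int) -> FlowRule:
        rule = FlowRule(venue, bandwidth_mbps, priority,
                        expires=datetime.utcnow() + timedelta(days=trial_days))
        self.rules.append(rule)
        return rule

    def active_rules(self) -> List[FlowRule]:
        now = datetime.utcnow()
        # Expired trials simply disappear: no multi-year contract to unwind.
        self.rules = [r for r in self.rules if r.expires > now]
        return sorted(self.rules, key=lambda r: r.priority, reverse=True)


if __name__ == "__main__":
    net = TrialCapableNetwork()
    net.add_trial_venue("primary-exchange", 1000, priority=10, trial_days=365)
    net.add_trial_venue("new-asian-venue", 200, priority=5, trial_days=30)
    for rule in net.active_rules():
        print(f"{rule.venue}: {rule.bandwidth_mbps} Mbps, "
              f"priority {rule.priority}, until {rule.expires:%Y-%m-%d}")
```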

Flexibility is particularly important for firms that are looking to transition into new markets or to upgrade to new technology. This is where an optical transport network, or OTN, comes into play. An OTN is an industry standard protocol. At a networking level, an OTN can work with many kinds of legacy protocols. That allows a provider to build a network that can carry diverse data types, potentially providing firms with high bandwidth and high performance.

Having that bandwidth and performance produces savings: it allows firms to roll out applications more easily and it reduces network maintenance. For banks, lowering the total cost of ownership and cutting costs in general is without doubt one of the top priorities. In fact, many of the new applications that they’re looking to roll out are actually part of wider efforts to trim costs.

At the same time, financial services firms need to be able to offer their clients iron-clad service. To do that, they need back-up networks. Trading firms can’t rely on one link or one provider. When the fastest networks can command massive premiums, coming up with cost-effective redundancy strategies becomes all the more important. In some cases, a single millisecond or two can inflate the price of a network enormously.

Focus on strategy

Hardware and software are by no means the whole story. What can make all the difference is having a cohesive strategy. When considering network requirements, a huge number of factors come into play. Firms need to think about what markets they’re likely to want to trade in, what venues they may want to do business on, what trading styles they may want to adopt, and a raft of other considerations in terms of not just the firm’s current activities but its business in the future. And in addition to long-term planning, firms can take a more tactical approach. Being able to dip a toe in the water in terms of a new market, a new activity or a new type of technology only makes sense.

Networks by their very nature involve numerous entities, whether that means trading firms, vendors or venues. Getting the right access and deciding on the best technological means of developing your network involves planning and partnerships with firms that can help develop a strategic approach. To borrow from an old phrase, a network strategy is a journey, not a destination.

To learn more about Telstra’s offerings for financial firms, including global network capabilities, consultancy services and the low latency EPL Express service, visit www.telstraglobal.com

©BestExecution 2016


Connectivity 2016 : Aaron Wald

ADDRESSING THE COMPLEX NEEDS OF MODERN ELECTRONIC TRADING.


Aaron Wald, CTO of Vela Trading Technologies, talks about staying competitive at a time when trading connectivity is getting more complicated than ever.

In today’s electronic trading environment, market participants, both buyside and sellside, are facing challenges on a number of fronts.

On one front is an ever-changing and increasingly onerous regulatory landscape. Proposed new rules, such as RegAT in the US and sections of MiFID II in Europe, will place a much greater burden on firms to ensure that their trading systems cannot disrupt the market. Market participants engaged in any kind of automated, algorithmic or electronic trading will need to consider how these new regulations will impact them and what actions they need to take in order to stay compliant.

On another front is the increased complexity of both the markets and the technology that underpins them. Markets are becoming more and more interconnected, with the number of instruments and asset classes traded electronically across multiple venues continuing to grow. In this environment, firms face the combined challenge of how to process massive amounts of market data, how to connect and route their orders to the appropriate venues, and how to make best use of software, hardware and network connectivity to do it all in a timely manner.

On top of the regulatory and market complexity challenges, firms need to consider how to stay competitive and profitable in an environment where margins are becoming thinner and thinner, particularly in the low-latency trading space. Gone are the days when small reductions in latency could lead to massive increases in revenues. In today’s electronic markets – where low latency is a pre-requisite to doing business – firms have to look beyond pure speed in order to gain a competitive edge.

So what steps can firms take to address these challenges?

Regulatory compliance

With much of the new regulation revolving around greater transparency, the onus is on banks and trading firms to provide regulators with deeper levels of data, not only around their trading activities, but also around their trading systems. Today, it is not enough for firms to be fast. In order to stay compliant, they also need to be transparent.

This means that trading systems should be self-reporting and provide accurate and meaningful statistics around areas such as latency monitoring. If a firm is asked by the regulator to report on its true latencies, it should be able to identify not only where latency exists – whether in the software, the hardware or the network – but also under what conditions that latency changes.
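As an illustration of the kind of self-reporting described above, the sketch below is a minimal, hypothetical example rather than any particular vendor's tooling: it gathers per-segment latency samples (simulated here) and reports median, 99th-percentile and maximum figures per segment. The segment names and timing distributions are assumptions.

```python
"""Illustrative sketch only: self-reporting latency statistics per segment
(software, hardware, network). Timings are simulated; names are hypothetical."""

import random
import statistics
from collections import defaultdict

samples = defaultdict(list)   # segment name -> list of latencies in microseconds


def record(segment: str, latency_us: float) -> None:
    samples[segment].append(latency_us)


# Simulate a few thousand order round-trips broken down by segment.
for _ in range(5000):
    record("software", random.gauss(12, 2))      # e.g. feed handler + strategy
    record("hardware", random.gauss(4, 0.5))     # e.g. NIC / switch traversal
    record("network", random.gauss(85, 10))      # e.g. cross-connect to venue


def report() -> None:
    for segment, values in samples.items():
        q = statistics.quantiles(values, n=100)  # cut points for percentiles 1..99
        print(f"{segment:>8}: median={q[49]:.1f}us "
              f"p99={q[98]:.1f}us max={max(values):.1f}us")


report()
```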

Another key area that the regulators are focusing on is the risk of market disruption. Upcoming regulations such as RegAT and MiFID II will go way beyond the pre-trade risk controls that were introduced a few years ago with SEC rule 15c3-5 in the US and the equivalent ESMA guidelines in Europe. Under the proposed legislation, many firms will now need to undertake substantial reviews of their systems and processes to ensure – and to prove to the regulator – that nothing they do will disrupt the market.

This challenge can be addressed by putting in place best practices around software development and ensuring the right levels of regression testing and release certification are followed. For example, with every exchange-driven change (EDC) that triggers a software release, clients of trading system vendors will want those vendors to explain exactly what is changing in that release and to prove that their regression testing procedures are sufficiently robust to not introduce any bug – either around the EDC itself or elsewhere in the system – that will have a negative impact on the market.
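One simple form such regression testing can take is a golden-file check: captured exchange messages are re-run through the release candidate and compared against output certified for the previous release. The sketch below is a hypothetical illustration; the decoder, message format and test data are invented stand-ins for a real feed handler affected by an EDC.

```python
"""Illustrative sketch only: a golden-file regression check of the kind a
vendor might run after an exchange-driven change (EDC). The decoder and
message format are hypothetical stand-ins."""

import json


def decode(raw: str) -> dict:
    """Hypothetical feed decoder: parses 'SYM|PRICE|SIZE' into a dict."""
    symbol, price, size = raw.split("|")
    return {"symbol": symbol, "price": float(price), "size": int(size)}


def test_decoder_against_golden_output() -> None:
    # Captured inputs and the outputs certified for the previous release.
    captured_inputs = ["ABC|101.25|300", "XYZ|7.50|1200"]
    golden = [
        {"symbol": "ABC", "price": 101.25, "size": 300},
        {"symbol": "XYZ", "price": 7.50, "size": 1200},
    ]
    decoded = [decode(raw) for raw in captured_inputs]
    # Any divergence from the certified output fails the release candidate.
    assert decoded == golden, json.dumps({"got": decoded, "expected": golden})


if __name__ == "__main__":
    test_decoder_against_golden_output()
    print("EDC regression check passed")
```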

Addressing complexity

One unintended consequence of all these new regulations is that markets are becoming more and more complex, and market participants – already under pressure to be fast – now have to process and act upon ever increasing amounts of data in ever shorter time periods.

This means that firms are continually looking for new and innovative technology solutions to help them navigate through this complexity. The problem, however, is that as the markets become more complex, much of the software and hardware surrounding the markets becomes more complex too, arguably leading to a situation where the markets are less safe, which is definitely not what the regulator intended.

So in order to help, rather than hinder firms in dealing with complexity, how can the various components of a trading solution best work together in an efficient, reliable and cost-effective way?

The answer lies in ensuring that the software and hardware underpinning the trading systems are working in harmony – the renowned high-performance computing and trading system expert Martin Thompson describes this as ‘Mechanical Sympathy’.

Hardware is continually evolving, with CPUs making ever greater use of multiple cores and complex memory architectures. The latest Intel Skylake chips, for example, will in the future even be shipped with an integrated FPGA (Field Programmable Gate Array). The software running on these machines therefore needs to understand what the underlying components are doing in order to take advantage of all these characteristics.

This means that in order to address complexities in market infrastructure, trading software needs to be intelligently engineered to work in harmony with the underlying hardware, so that the application gets the most out of the machine in terms of latency, throughput and access. Rolling out EDCs on heterogeneous chips is a much more viable proposition than programming, regression testing and certifying such changes on pure FPGA-based systems, for example.
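By way of one small, concrete example of working with rather than against the hardware, the sketch below pins the current process to a single CPU core so the operating system does not migrate a latency-critical worker between cores. This is only one illustrative technique, it is Linux-specific, and the core number is an arbitrary assumption.

```python
"""Illustrative sketch only: one 'mechanical sympathy' technique, pinning a
latency-critical worker to a dedicated CPU core. Linux-specific; the core
number is an example."""

import os


def pin_current_process(core: int) -> None:
    # os.sched_setaffinity is available on Linux (Python 3.3+); pid 0 means
    # "the calling process".
    os.sched_setaffinity(0, {core})


if __name__ == "__main__":
    if hasattr(os, "sched_setaffinity"):
        pin_current_process(3)          # keep the hot path on core 3
        print("running on cores:", os.sched_getaffinity(0))
    else:
        print("CPU affinity control not available on this platform")
```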

Staying competitive

The firms that are able to handle this complexity are the ones that will gain a significant advantage going forward, particularly in a multi-asset trading environment where it is necessary to process large data sets and connect to multiple markets.

In the equities and exchange-traded derivatives space, where electronic trading is now relatively mature, margins today are thin. A few years ago, high frequency trading (HFT) firms could gain significant advantages by shaving a few fractions of a second off their order flow, but those opportunities are now few and far between, so firms have to look beyond pure speed to stay competitive.

This means that they are increasingly adopting multi-asset trading strategies, in instruments such as fixed income and FX, for example. However, this brings about new challenges in terms of sourcing and processing the requisite market data for these instruments in an integrated, consolidated, and normalised way.
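A minimal sketch of that normalisation challenge is shown below: quotes arriving in different shapes from an FX feed and a fixed income feed are mapped onto one consolidated record. The feed formats, field names and values are invented for illustration.

```python
"""Illustrative sketch only: normalising quotes from different asset classes
into one consolidated record format. Feeds and field names are hypothetical."""

from dataclasses import dataclass


@dataclass
class NormalisedQuote:
    asset_class: str
    instrument: str
    bid: float
    ask: float
    venue: str


def from_fx_feed(raw: dict) -> NormalisedQuote:
    # A hypothetical FX feed quoting 'ccy_pair' with 'bid'/'offer'.
    return NormalisedQuote("fx", raw["ccy_pair"], raw["bid"], raw["offer"], raw["lp"])


def from_bond_feed(raw: dict) -> NormalisedQuote:
    # A hypothetical fixed income feed quoting by ISIN with 'buy'/'sell' prices.
    return NormalisedQuote("fixed_income", raw["isin"], raw["buy"], raw["sell"], raw["venue"])


if __name__ == "__main__":
    quotes = [
        from_fx_feed({"ccy_pair": "EURUSD", "bid": 1.1012, "offer": 1.1014, "lp": "LP1"}),
        from_bond_feed({"isin": "DE0001102345", "buy": 99.82, "sell": 99.88, "venue": "MTF-A"}),
    ]
    for q in quotes:
        print(q)
```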

Buyside firms who are able to process this data efficiently, and the sellside firms able to support those activities at scale, will steal a march on their competitors and continue to do well.

Looking ahead, it is likely that both buy and sellside firms will make increasing use of the efficiency of the Cloud for those applications in their electronic trading ecosystems that are not latency sensitive. We are already seeing this with the use of streaming market data from RESTful APIs, for example, to feed lower-tier, non-trading applications such as dynamic risk spreadsheets. In this regard, much of the messaging and middleware technology that has been implemented across banks within the last couple of decades could easily migrate to the Cloud.
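The sketch below illustrates that pattern under stated assumptions: a hypothetical RESTful snapshot endpoint is polled on a slow cycle to revalue a small book for a non-latency-sensitive risk view. The URL, JSON shape and positions are made up, and a canned response is used if the request cannot be made, so the example still runs offline.

```python
"""Illustrative sketch only: feeding a non-latency-sensitive risk view from a
RESTful market data endpoint polled on a slow cycle. The URL, JSON shape and
positions are hypothetical."""

import json
import time
import urllib.request

SNAPSHOT_URL = "https://example.com/marketdata/v1/snapshot?symbols=ABC,XYZ"  # hypothetical
POSITIONS = {"ABC": 10_000, "XYZ": -4_000}        # shares held per symbol


def fetch_snapshot() -> dict:
    try:
        with urllib.request.urlopen(SNAPSHOT_URL, timeout=5) as resp:
            return json.load(resp)
    except Exception:
        # Offline fallback so the sketch still runs end to end.
        return {"ABC": {"last": 101.30}, "XYZ": {"last": 7.55}}


def revalue(positions: dict, snapshot: dict) -> float:
    return sum(qty * snapshot[sym]["last"] for sym, qty in positions.items())


if __name__ == "__main__":
    for _ in range(3):                      # a few slow polling cycles
        print(f"portfolio value: {revalue(POSITIONS, fetch_snapshot()):,.2f}")
        time.sleep(1)                       # seconds, not microseconds: not latency sensitive
```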

With Cloud-based infrastructures designed to offer both simplicity and scale, and as people come to realise that many of their security fears around the Cloud are unfounded, firms will increasingly use such Cloud-based solutions, whether in a Public Cloud, Private Cloud, or Hybrid Cloud-type configuration.

Connecting it all together

With the electronic trading industry facing a number of pressures – from increased regulation, increased complexity, and shrinking margins – staying ahead of the competition will certainly present a challenge for both buyside and sellside firms in the coming months and years.

Consolidating vendor relationships will help firms address these challenges, particularly where they are looking to rationalise costs across asset classes and move from a heavily siloed to a more integrated electronic trading environment.

Firms can significantly reduce both complexity and costs by reducing the number of vendors they use for areas such as market data, enterprise distribution and market access gateways.

Vendors who understand high performance trading and offer the latest market data technology, and who can provide the breadth of coverage and depth of expertise demanded by today’s multi-asset, multi-region electronic trading marketplace, will enable banks, brokers, and trading firms to stay focused on their business by bringing both efficiency and simplicity to their trading infrastructure.

©BestExecution 2016


Connectivity 2016 : Andy Young

CONNECTED THINKING: A NEW APPROACH TO CAPITAL MARKETS CONNECTIVITY.

Andy Young, Specialist Sales Director, Capital Markets at Colt

Connectivity has always been at the heart of the capital markets. Without the global web of interconnections between trading venues, participants, clearing houses and regulators, the financial markets simply wouldn’t be what they are today.

The capital markets landscape is undergoing considerable change. The emergence in recent years of new technologies like cloud hosting and Software Defined Networking (SDN), plus the introduction of MiFID II, are causing participants to rethink how they operate and reassess what they need from their connectivity strategy.

Next generation networks are coming online that are more agile and responsive, allowing market players to customise capabilities according to their specific trading requirements. These networks also offer access to a fast-growing, interconnected global ecosystem made up of exchanges, trading venues and third parties offering everything from clearing house services to compliance testing, market data and cybersecurity solutions.

In this brave new world of heightened regulation, complex interconnectivity and technological change, trading firms face crucial decisions.

The pre-MiFID monopoly

In order to better understand where the industry is headed, it helps to see where we’ve come from. Arguably, MiFID I was the catalyst for the revolution in global trading that continues to unfold today.

Pre-MiFID I, trading was a monopoly. Central exchanges were the only venues where organised trading occurred; they set the prices and the commission levels, and even stipulated how you connected to them. Everyone was forced to use the same connectivity provider, so speed was a fixed parameter, influenced only by distance from the venue.

MiFID I’s introduction released the stranglehold that established exchanges had on equity trading. But by allowing for the creation of alternative trading venues, MiFID I inadvertently created the conditions necessary for the latency race and thus the globalisation of trading.

MTFs sprang up across Europe, offering lower transaction costs, often better liquidity and the ability to connect to them with any provider. Connecting with any provider often meant that participants got to choose the fastest one. Best execution rules meant that any latency in trading systems needed to be minimised so that speed wasn’t compromised by the extra step of routing orders, and this extended to the physical connection to exchanges.

The growth in the number of MTFs led to an exponential increase in the demand for connectivity in order to connect to this more complex and crowded market. This in turn led to providers boosting connection speeds through higher capacity fibre and data centres co-located at venues.

Making new connections, reaching new markets

Connectivity providers have subsequently developed financial extranet services that enable participants to connect to trading venues and market data feeds worldwide more quickly, more easily and more cost-effectively than was possible with point-to-point connections. Extranets have reduced the time and cost for trading firms to connect to and trade in new markets.

The “one-to-many” approach of extranets has made them the mainstay of connectivity strategies, and as they develop they are moving beyond providing pure connectivity. Increasing numbers of venues and third party suppliers are joining extranets in order to connect to an extended community of trading firms, brokerages and other prospective customers. Their services range from algorithm testing and transaction reporting that help firms meet their regulatory responsibilities, to aggregated market data and content for making better, more informed trading decisions.

The effect of regulation: reporting and algo testing

The forthcoming MiFID II directive expands the capital markets ecosystem – and by extension the value of a financial extranet service to market participants. Trading firms, exchanges and service providers must learn how to meet a range of regulatory requirements for all types of asset trading over the next 12 months. MiFID II’s focus on stringent reporting and testing policies means firms must now connect not only to the exchanges and venues where they execute trades, but also to specialist providers (ARMs and APAs) whose services and solutions enable them to meet their regulatory responsibilities.

An example of this is MiFID II’s mandate for regular testing of trading algorithms. According to MiFID II, any firm that uses algorithms must carry out its own regular testing to prove to the regulator that its algorithms won’t disrupt a market, or malfunction under stress conditions.

The practical solution for algorithm traders is to test and monitor their algorithms in independent lab environments provided by specialist consultancies that simulate live market conditions. In this way, they can demonstrate independence and auditability to the regulators. They can also adjust and make necessary changes to their algorithms at very short notice, before they go live into the markets.

Algorithm traders will need to undergo this testing on an annual basis, and also whenever they make significant changes to an algorithm. An extranet can provide the fast and convenient access to secure third-party lab facilities so that regular testing can become routine.
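A very stripped-down flavour of such lab testing is sketched below: a toy algorithm is replayed against a simulated, stressed price path and checked against simple safety limits (maximum order size and order rate). The algorithm, price model and thresholds are all invented for illustration and are far simpler than what a specialist test environment would provide.

```python
"""Illustrative sketch only: replaying a stressed price path through a toy
algorithm and checking simple safety limits. All thresholds are examples."""

import random

MAX_ORDERS_PER_SECOND = 50
MAX_ORDER_SIZE = 1_000


def toy_momentum_algo(prices):
    """Emits (second, size) orders on large price moves; purely illustrative."""
    orders = []
    for t in range(1, len(prices)):
        move = prices[t] - prices[t - 1]
        if abs(move) > 0.5:
            orders.append((t, min(int(abs(move) * 100), 900)))  # capped order size
    return orders


def stressed_prices(n=600, jump_at=300, jump=-15.0):
    """A flash-crash style path: gentle noise, then a sharp dislocation."""
    prices, p = [], 100.0
    for t in range(n):
        p += random.gauss(0, 0.3) + (jump if t == jump_at else 0.0)
        prices.append(p)
    return prices


if __name__ == "__main__":
    orders = toy_momentum_algo(stressed_prices())
    per_second = {}
    for t, size in orders:
        per_second[t] = per_second.get(t, 0) + 1
        assert size <= MAX_ORDER_SIZE, f"order size breach at t={t}: {size}"
    assert all(c <= MAX_ORDERS_PER_SECOND for c in per_second.values()), "order rate breach"
    print(f"{len(orders)} orders generated under stress, all within configured limits")
```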

The effect of regulation: best execution

Best execution is another important aspect of MiFID II that an extranet service can help with. MiFID II makes best execution compulsory and extends it to all asset classes, not only equities. It will be one of the more complicated aspects of MiFID II for market participants to implement: the task of connecting to multiple venues, comparing prices, ensuring quality of execution and then proving that quality, is a considerable challenge.

That said, an extranet service already possesses many of the fundamental properties needed to help firms with best execution compliance. It has the connections to a range of liquidity venues, plus access to market data and the catalogue of analytics and algorithms needed to trade. The global reach of an extranet means an accurate and complete market view is relatively straightforward to obtain. It also has the potential to act as a pseudo-consolidated tape that serves as a benchmark for measuring best execution.
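The measurement side of this can be pictured with a toy calculation, sketched below: executed prices are compared against the best bid or offer visible across several venues at the time of the trade, the sort of benchmark a pseudo-consolidated tape could provide. The venues, quotes and executions are invented.

```python
"""Illustrative sketch only: comparing achieved execution prices against the
best quote visible across several venues at the time of the trade. Data is
made up."""

# Best bid/ask per venue at the moment of execution (hypothetical snapshot).
venue_quotes = {
    "VENUE-A": {"bid": 99.98, "ask": 100.02},
    "VENUE-B": {"bid": 99.99, "ask": 100.03},
    "VENUE-C": {"bid": 99.97, "ask": 100.01},
}

executions = [
    {"side": "buy", "price": 100.02, "venue": "VENUE-A"},
    {"side": "sell", "price": 99.99, "venue": "VENUE-B"},
]

best_ask = min(q["ask"] for q in venue_quotes.values())
best_bid = max(q["bid"] for q in venue_quotes.values())

for ex in executions:
    benchmark = best_ask if ex["side"] == "buy" else best_bid
    slippage = (ex["price"] - benchmark) if ex["side"] == "buy" else (benchmark - ex["price"])
    print(f"{ex['side']:4} on {ex['venue']}: executed {ex['price']:.2f}, "
          f"consolidated best {benchmark:.2f}, slippage {slippage:+.2f}")
```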

It’s not just new regulation that’s reshaping and redefining connectivity in capital markets. The adoption by capital markets firms of new technologies such as Software Defined Networking (SDN) and cloud computing can offer improved agility, flexibility and scalability – all at a lower cost.

The future is the cloud

The capital markets sector has traditionally been slow to adopt new technologies. However, more and more venues and trading firms are adopting secure private cloud-based hosting to take advantage of its greater flexibility, with the opportunity to scale up storage and IT resources (including complex computing capability) as and when required.

However, concerns around data security and the potential loss of access to critical apps and systems remain an ongoing issue that affects firms’ uptake of public cloud services from third-party providers. As an alternative, some participants adopt a hybrid approach by moving less critical functions to the lower-cost public cloud, while retaining sensitive and business-critical systems in their own private proprietary clouds.

Use of the cloud to host key functions and features, such as a bank’s front office trading environment, is not yet widespread. But cost pressures and CAPEX savings will drive uptake in the longer term. The tipping point will come when we see whole venues moving to the cloud. When – rather than if – this migration takes hold, the issue of how to connect to these remotely hosted venues, platforms and services will take on added significance.

Software Defined Networking

Accompanying the trend towards private cloud-hosted trading platforms and services is the emergence of SDN over the last five years or so. SDN is a move away from the “static” architecture of traditional networks in favour of a virtualised approach that is more dynamic, flexible and scalable, but which importantly still delivers a reliable, resilient, low-latency connection to key cloud-hosted venues and exchanges.

The combination of the trend towards cloud and the emergence of SDN has the potential to trigger a new connectivity arms race in the capital markets. Fast, stable and secure connectivity to the cloud will become all-important.

But so too will agility. Organisations want to manage and customise the connectivity, functions and services they use via their networks. They want to adjust the bandwidth available to them, or introduce (“switch on”) new functions into the network like extra cybersecurity or a firewall, from a single dashboard.

A more agile, software-based network gives capital markets participants more control. They can dial up extra bandwidth and capacity to their network as and when they need it – for example, during peak trading time or when a major market-moving announcement is due, such as the latest US labour market data.

Conversely, they can reduce the bandwidth, at night for example, and use the network for order clearing: they could even turn it off entirely. All without delay and without having to renegotiate their SLA with their service provider. Rather than purchase a fixed amount of capacity to a particular location or venue for 12 months, trading firms will move connectivity between different participants and providers according to their specific needs.
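A simple way to picture this self-service model is a time-of-day bandwidth profile, sketched below under stated assumptions: capacity scales up around a scheduled announcement, runs at a normal level during trading hours and drops overnight. The times, bandwidth figures and the idea that a portal would accept such a schedule are illustrative only.

```python
"""Illustrative sketch only: a time-of-day bandwidth profile of the kind an
SDN self-service portal might apply. Times and figures are examples."""

from datetime import time

# (window start, window end, bandwidth in Mbps) - evaluated in order.
BANDWIDTH_PROFILE = [
    (time(13, 25), time(13, 45), 2000),   # around a 13:30 data release
    (time(8, 0),   time(17, 30), 1000),   # normal trading hours
    (time(22, 0),  time(23, 59), 100),    # overnight: clearing traffic only
]
DEFAULT_MBPS = 300


def bandwidth_at(t: time) -> int:
    for start, end, mbps in BANDWIDTH_PROFILE:
        if start <= t <= end:
            return mbps
    return DEFAULT_MBPS


if __name__ == "__main__":
    for t in (time(13, 30), time(10, 0), time(22, 30), time(5, 0)):
        print(f"{t}: {bandwidth_at(t)} Mbps")
```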

Connectivity is king – again

SDN’s model of agile, customisable connectivity is the direction that capital markets will follow. Both SDN and the cloud offer participants greater flexibility, more connectivity and improved ROI. Participants will have a wider range of differently-priced tiered services from their provider to choose from. They will be able to select their preferred service and its level of guaranteed latency, reliability and redundancy based on their trading objectives, or their budget.

But even with continuing technology advancements, today’s networks are reaching their limit in latency versus cost terms. Alternative wireless technologies like microwave and millimetre wave offer ultra-low latency, but their capacity and bandwidth are very limited at this stage. Research and investment continues into other early-stage connective technologies such as hollow-core fibre, which offers the ultra-low latency of wireless with the capacity of fibre.

Extranets – from one-to-one, to one-to-many

The increasingly international diversity of the capital markets marketplace, together with factors like the incoming MiFID II regulatory framework, has widened firms’ connectivity needs beyond low-latency connectivity. As well as having a global connective reach, an extranet service can provide participants with a broad range of value-added features and services. These include the ability to more quickly and easily connect to new markets and clients worldwide than was previously possible, in order for firms to take advantage of new and emerging trading opportunities as they occur.

An effective extranet also offers participants immediate access to a broad ecosystem of third party suppliers. In addition, its guaranteed deterministic latency enables the near real-time delivery of market-moving content and data, as and when it’s needed. Built-in real-time performance monitoring makes sure participants receive the requisite level of service and speed of connection that their provider promises in its Service Level Agreement.

Such a wide and varied range of advanced features and services can potentially be confusing and difficult for the client to manage and monitor. That’s why an extranet’s user interface should incorporate every feature and function, but in a way that’s accessible and user-friendly. A self-service portal fronted with an easy-to-use dashboard lets the participant conveniently pick and choose which features, services and providers they want to access and use, while also allowing them to continually check the quality of their connection.

Reaching the inflection point

The capital markets industry is today at an inflection point. The international expansion of capital markets and the growing global interconnectivity between exchanges, venues, suppliers and trading firms creates new and complex challenges, as well as new opportunities for growth. Cloud and SDN are also set to disrupt existing network and connectivity strategies.

Participants already face ongoing pressure to cut costs and stay ahead of the competition. But now they must also factor in new regulations like MiFID II that require them to be more transparent in their trading practices and reporting.

Just as MiFID I helped define the capital markets as we know them today, these factors are defining the markets of today and tomorrow. As such, participants must rethink their connectivity strategy, for now and into the future.

In today’s increasingly complex capital markets sector, connectivity will more than ever be the vital difference between success and failure for trading firms. They should therefore look for solutions that can meet and manage their current data, service and connectivity needs, and which also have the scalability, flexibility and foresight to adapt, evolve and expand their functions and services according to the changing conditions and demands of the market.

©BestExecution 2016


Connectivity 2016 : Andrew Rossiter

TRADING IS NOW SMARTER.


In the last several years mobile trading has reached new heights, with a level of penetration that many never thought possible. But this is only the starting point and the industry has a long way to go, according to ADS Securities CTO, Andrew Rossiter.

As an industry we may have overlooked the real reason why mobile trading has expanded so quickly and widely. We still look at a smartphone as a device we can place a trading platform onto. This is the wrong approach and we’ve only recently started to learn from the mobile app industry. They know that a smartphone is part of a lifestyle and you need to understand that lifestyle, whether you are a trading app or chat network.

Globally, many millions of people now have access to relatively cheap but very powerful smartphones. They are in the hands of new users from emerging middle classes in markets as diverse as China, South America and Africa. This is an educated, aspirational and very ‘phone’-literate audience. They may never have owned a desktop, or even a laptop, but they are expert in accessing, managing and using data on mobile devices. They know this platform and need us to deliver a product that works in this handheld environment.

If we want to interact with smartphone users we have to be aware that we are entering an ecosystem where these devices are essential to the way they live their lives. The modern smartphone controls their social life; it provides their retail therapy, news flow and business communications, and even gets them a taxi. If you decide to enter this world you have to make sure that your offering is as efficient and easy to access as their favourite apps.

Trading apps have in the past tried to fit what we have, and understand, on our desktops onto a screen that is less than five inches high. This is no longer an option, and a lot of high quality development work is being carried out to make trading on a smartphone simple, fast and intuitive. Desktops do not have pinch-zoom, swipe or tap commands, all of which offer new ways to access and control actions. This is about business communications for a generation which did not grow up with a keyboard and a mouse. If the functionality of a trading app does not reflect the way other apps work, then it will not be used.

An app also provides a two-way flow of data which has not existed before in a trading environment. The big step forward is recognising that an app user expects the app provider to be using this data to tailor the product to his or her needs. If the app does not learn and improve the experience for the user it will be dropped. A smart app can access an infinite amount of data. It can record the choices, the style of trading, the time, location and frequency. There is always a question of privacy, but the reality is that app users demand that we use this intelligence in a way that supports their use of the app.

A user who places EUR/USD trades every morning between 08:00 to 08:30 on their commute to work can be offered tighter spreads, hedging options and even online vouchers for coffee at the shop close to their office – anything and everything is possible. It is about knowing the client and what will make them ‘stick’ to your app. It is very easy to get downloads of trading apps but as many firms are finding out, it is equally as easy for the app to be deleted.
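As a toy illustration of that kind of pattern recognition, the sketch below counts (instrument, hour-of-day) pairs from a month of in-app trade events and flags any dominant habit that could drive tailored pricing or offers. The events and the 50% threshold are invented assumptions.

```python
"""Illustrative sketch only: spotting a recurring in-app trading habit (the
EUR/USD-on-the-commute example) from simple event logs so the app can tailor
what it shows. Events and thresholds are made up."""

from collections import Counter

# (instrument, hour of day) for each trade placed in the app over a month.
events = [("EURUSD", 8)] * 18 + [("EURUSD", 12)] * 2 + [("GBPUSD", 15)] * 3

habit_counts = Counter(events)
total = len(events)

for (instrument, hour), count in habit_counts.most_common():
    if count / total >= 0.5:            # a dominant, repeated pattern
        print(f"Detected habit: {instrument} around {hour:02d}:00 "
              f"({count}/{total} trades) - candidate for tailored pricing/offers")
```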

The approach of highly targeted offers may be a concern to compliance officers, but we are entering the ecosystem of the app user and will need to work to their rules as well as within financial regulations. There is a great need for trading to learn from other industries that have already spent millions of dollars working out how to be the smartest app on the smartphone. This is a difficult step for the trading industry to take – we are used to working in our own silo.

So we are entering a new world full of ways to offer better, simpler, more efficient trading to a rapidly expanding audience. Our industry has a lot of work to do, but just looking at the latest generation of trading apps, the advances being made are very exciting. By the end of 2016 we will be seeing some highly developed and resourceful solutions which will lead to the next expansion in mobile trading.

©BestExecution 2016


Connectivity 2016 : James Maudslay

THE CLOUD CHALLENGE.

 


Equinix is best known for its network of data centres spanning 40 markets across five continents, but this network has allowed the company to offer the only global interconnection platform. James Maudslay, Financial Services Expert – Vertical Marketing EMEA at Equinix, explains why the cloud and what the company has called ‘Interconnection Oriented Architecture’ will become an integral part of the financial services fabric.

First, what is Interconnection Oriented Architecture™?

Interconnection Oriented Architecture™, or IOA, allows businesses to re-architect IT delivery to securely connect employees, partners and customers with what they need, in the right context, using the devices, channels and services they prefer. Such direct interconnectivity enables enterprises to react in real time, adapt quickly to change and leverage digital ecosystems to create new value and growth.

IOA is a completely flexible architecture that offers enhanced security, access to dedicated computing, cloud services – or a hybrid combination of the two – and allows application performance to be used as a driver for change, since new infrastructure needs to perform at least as well as any existing environment. It also has the flexibility to allow businesses to innovate, to share workloads yet communicate with all their partners, customers and suppliers easily and in a timely manner.

It also facilitates real cost control and, particularly pertinent for financial service firms, enables more effective compliance with regulation. There is no ‘lock in’ so firms also have the widest possible vendor choice, as well as the use of performance hubs and access to best of breed solutions and suppliers. Businesses can also deploy edge computing enabling them to take computing and storage resources to the location where they are needed to allow application performance close to the actual users.

What were the initial challenges?

Achieving buy-in was always going to be challenging given the non-physical nature of cloud. So the challenges for on-boarding included security, location of service, data protection issues, questions over suppliers, and questions over contracts. Also, as a heavily regulated industry, financial services are naturally more resistant due to regulatory concern. Yet IOA is a vehicle which helps overcome or improve many of these factors. Equinix also works in parallel with the Financial Conduct Authority (FCA) guidelines, which puts us in a good position to help our customers develop an infrastructure that works within those guidelines.

Could you outline the evolutionary uptake of cloud services by the financial services sector and what is the state of play now?

It would be too early to say that financial services businesses are using cloud in real anger, but having secure, performant access to a range of cloud service providers (CSPs) is allowing trading departments to begin developing solutions. This has been assisted by the fact that the FCA announcement on regulation around cloud services has indicated that there is no fundamental reason why cloud cannot now be used. As financial services firms start to become more familiar with it, we believe that they will gradually start to migrate more workloads to the cloud, all assisted by our ability to interconnect them to multiple cloud service providers, who in turn will start to specialise in the services that the financial services firms provide.

We are already seeing this with the growth of services such as Office 365, which is really beginning to gain some traction, moving licenses and storage models from the traditional capital expenditure (Capex) models onto an operational expenditure (Opex), pay as you use arrangement.

Having established a solid foundation in the electronic trading arena, can cloud services be extended to this sector?

Equinix’s foundations in electronic trading do not currently extend to use of cloud, but there is certainly huge potential for development of cloud in the design and testing of new products and systems as well as long-term data storage. The cloud can also move computing needs to the edge of a business’s network to bring data and computing closer to the users.

Other areas where we see expansion are in analytics, data recovery and business continuity. We also see it widening its access to private as well as public cloud, plus to connectivity that does not involve use of the public internet. It could also allow firms to propose hybrid solutions that can bridge the gap between traditional ‘on-premise’ infrastructure and contemporary cloud services.

How has financial services regulation changed the landscape and are there any particular pieces of legislation that are making more of an impact?

While my expertise is limited to EMEA, we all recognise that regulation differs between territories, but with IOA, due to its flexibility, Equinix can assist financial services firms in any jurisdiction where we have a presence.

In EMEA as elsewhere, regulation of the financial services industry has had a huge impact since the crash of 2008. Whilst it is rare for regulators to comment directly on IT matters, it can be safely assumed that any action which increases transparency, security and risk management will be encouraged by regulators. As a consequence, regulation does indeed have a significant impact on IT matters, typically in the arena of reporting and data storage.

However, there are currently three pieces of legislation/regulation that will have particular impact:

  1. Guidance consultation 15/6 – The proposed guidance for firms outsourcing to the ‘cloud’ and other third-party IT services – this outlines the FCA’s findings.
  2. General Data Protection Regulation – being introduced by 2018.
  3. MiFID II – agreed by EU members states but currently having the exact clauses developed (which is proving to be a very slow process).

What impact do the different regulatory jurisdictions have on the architecture you are building?

On the architecture itself, very little, as IOA is designed to deliver services across jurisdictions. The impact is felt more in the individual design of an IOA deployment to the individual business concerned. A good example would be that of a US-based company trading in Europe. The invalidation of the Safe-Harbour arrangements by EU courts means that IOA can be configured (or re-configured if necessary) to deliver the required services in Europe, whilst also meeting the business’s needs in the USA. Using IOA, the appropriate data can be secured as necessary within the EU with appropriate and compliant systems and procedures put in place to allow for the business’s reporting needs.

Another example would be that of a multi-national company trading across the ASEAN countries that has to comply with data sovereignty laws. IOA can be used to ensure that the appropriate and performant deployment of computing instances across the various countries ensures compliance with the different regulations.
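A highly simplified way to picture jurisdiction-aware placement is sketched below: a small routing table maps a client's jurisdiction to a compute region and decides whether data must stay in that region. The jurisdictions, region names and rules are invented and are not Equinix configuration; they simply illustrate the idea of encoding residency constraints.

```python
"""Illustrative sketch only: a toy jurisdiction-to-region routing table for
data residency, loosely following the EU and ASEAN examples above. All
mappings are invented."""

RESIDENCY_RULES = {
    "EU": {"region": "eu-west", "data_must_stay_in_region": True},
    "SG": {"region": "ap-southeast", "data_must_stay_in_region": True},
    "US": {"region": "us-east", "data_must_stay_in_region": False},
}


def placement_for(client_jurisdiction: str) -> dict:
    rule = RESIDENCY_RULES[client_jurisdiction]
    return {
        "compute_region": rule["region"],
        "storage_region": rule["region"] if rule["data_must_stay_in_region"] else "us-east",
    }


if __name__ == "__main__":
    for jurisdiction in ("EU", "SG", "US"):
        print(jurisdiction, placement_for(jurisdiction))
```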

Additionally, IOA allows businesses access to specialist suppliers and partners who will be territorially or regionally aware, and have focused their solutions on the individual needs of a locality or region.

How does the cloud, and fintech in general, help financial services firms meet the new demands and requirements of regulation and a more volatile trading environment?

Regulation will have certain general effects on business when it comes to IT matters. These will include increased reporting, process control, speed of response, security, reliability, data recovery and business continuity. The cloud and fintech are all designed to make responses to these easier – the cloud through flexibility of use (in terms of quantity of resource, performance levels and commitment, and therefore cost) whilst fintech provides solutions to business issues in terms of focused solutions, speed of delivery, ease of deployment and again, cost.

Combined together, the two solutions can respond to constantly changing situations, including new requirements proposed by regulators, backed up by a flexible computing resource that can respond to need, as, when and where required.

While there is a greater understanding of the cloud today, how are concerns such as security being addressed?

Financial services firms currently have six principal concerns about cloud today, all of which have solutions. The first is about the control of the infrastructure and whether they will have less scope to tailor their services. Interconnection capabilities will allow customer choice through the access of multiple clouds and tighter clauses over service delivery will help prevent cloud service providers (CSPs) from changing outsourcing arrangements without the financial institution’s knowledge.

The second issue is the risk of vendor lock-in when moving to the cloud, and that short-term gains may be overshadowed by medium and long-term changes. This can be mitigated by a multi-cloud strategy that ensures gains can be maintained, and even allows for the use of different clouds for alternative purposes. Competition should promote best-of-breed offerings but also ensure that long-term benefits do not erode and that innovation is maintained.

The third area of focus revolves around data residency, including whether data can be guaranteed to remain in the country or city, as well as back-up and recovery for financial institutions’ mission-critical systems. Solutions include using territorially or regionally aware suppliers through an extensive multi-cloud access capability for the former, and a CSP that has an extensive network of suppliers and partners for the latter. Data security is also key, which is why a combination of policies, with best in breed perimeter and data defences, can be effective. Again, a wide variety of partners is crucial.

What are the remaining issues?

They include the need for ‘access to premises’ for firms’ own due diligence and audit requirements, and also for regulator access. This can be facilitated by openness to on-site visits and by ensuring that security measures are in place for computing as well as physical security. Financial institutions can work with their supplier of choice to ensure compliance or select suppliers based on their regulatory response.

Moreover, there are questions as to how to avoid systems outages during lock-down periods at certain times of the year. As most CSPs are industry agnostic, there may not be an appreciation of the ‘banking on-line day’ where systems cannot be interrupted. Expert CSPs who serve the sector can be on hand to identify and consume services in the appropriate time zone plus service level agreements can also prove helpful. In addition, the ability to allow simplified, secure and performant access to multiple cloud suppliers will meet the financial institution’s specific cloud infrastructure demands.

Finally, one of the biggest fears is the threat of cyber attacks and terrorism and whether suppliers can support high levels of disaster recovery as well as business continuity requirements. Multiple providers can address this either through their own services, or via the use of specialist partners. Private connections also lessen intrusion risk by avoiding the internet. As for natural disaster risk, which is also high on the list, accessing cloud providers via diverse routes and service hubs will address this at a local and regional level, whilst the ability to employ geographically separated CSPs, or even the use of multiple CSPs, will protect the actual computing and storage capability required.

What do you predict the future holds for cloud services and greater connectivity for the financial services sector?

The use of cloud is set to continue to a point where only some legacy systems that simply cannot be deployed otherwise will be left on dedicated hardware. This may take some time, but regardless of the exact timing, the choice of, and ability to access, multiple clouds simultaneously, securely and in a performant fashion will grow in importance, especially for regulated industries.

Regulation is likely to continue to develop in terms of its penetration, making everything discussed here ever more important for clients if they are to innovate and grow, and leverage regulation to their advantage rather than treating it as a necessary evil.

The ability to connect to services rapidly, as well as ensuring that applications are delivered to their users where and when they are needed will also increase in importance, whilst interconnection to business partners, service providers and of course a business’s staff and clients will become key to success.

This can only be delivered through the use of a true interconnection oriented architecture to underpin the IT service that a modern business will demand.

©BestExecution 2016


Connectivity 2016 : Philippe Chambadal & Joseph Turso

A QUIET REVOLUTION.


By Philippe Chambadal, President, SmartStream & Joseph Turso, Product Manager, The SmartStream Reference Data Utility

Debates about connectivity often focus on the front office and the IT systems used by financial firms to trade across multiple asset classes. In the world of the back office, a quiet revolution is underway as sophisticated information technology software enables banks to break down traditional silos and carry out post-trade processing across asset classes, assisting firms to reduce the huge back office costs which currently burden the industry.

Advanced technology is also being used to power utilities, capable of providing shared services to multiple financial institutions. With banks increasingly putting competitive rivalries aside and turning to these mutualised services in a bid to cut overheads, technology appears to be driving not merely the erosion of walls between asset classes but removing some of the barriers between financial institutions themselves.

Trading instruments which span asset classifications, such as hybrid products, are becoming increasingly popular amongst financial institutions. Accessing the information required to trade these products creates many challenges for banks, as the data is very often stored in a number of discrete silos. The existence of these silos does not simply create headaches for firms looking to trade products which cross traditional asset class boundaries but raises some even more fundamental questions for the financial industry, especially in relation to the huge post-trade costs that this silo-based infrastructure engenders.

While financial markets were buoyant, maintaining a siloed infrastructure manned by large numbers of administrative staff was perfectly feasible. If a processing issue cropped up, banks simply employed more people to solve it. In the wake of the financial crisis of 2008, however, banks’ revenues have been hit hard, dwindling considerably – according to the Boston Consulting Group’s 2016 Global Capital Markets report, global investment banking revenues declined to $228 billion in 2015, down 5% from $239 billion in 2014 and 16% from $271 billion in 2010, with total revenue lower in 2015 than at any point since 2009.

Costs, on the other hand, remain persistently high, with the financial industry spending – according to one DTCC report – as much as $100bn per year on post-trade processing. Investment banks, more than other financial institutions, are feeling the pressure of high post-trade overheads. While many buy-side organisations have already hived off back office activities to custodians and other service providers, investment banks are still burdened by bloated, unwieldy back offices which are extremely costly to maintain. The toxic combination of shrinking revenues and burgeoning post-trade costs is now leading many banks to review the way in which they manage their post-trade activities.

New regulation, from European Market Infrastructure Regulation (EMIR) through to Basel III and the Dodd-Frank Act, is exerting further pressure on the industry to review post-trade practices. Financial authorities have introduced stringent reporting requirements and are demanding far greater levels of transparency than ever before. Increasingly, there is a desire on the part of regulators for an intra-day, real-time view into banks’ operations and the flow of payments and trades. For some banks, providing this level of up-to-the minute clarity into their operations could currently prove extremely challenging.

Automating processes and bridging asset classes in the post-trade world

One of the root causes of high post-trade costs is the fragmented and inadequate nature of banks’ processing systems. Many banks still have in place a large number of legacy IT applications, each of which performs a small part of a bank’s overall post-trade processing activities. Worryingly for financial institutions, these systems are far from joined up, slowing down processing times. The lack of interconnection between IT systems also makes it far more likely for errors to occur – all too often it is necessary for data to be re-entered, creating the danger that information is keyed in incorrectly. The fractured state of back office processing systems, and the lack of STP that this engenders, results in many broken trades, transactions and corporate actions. Putting right these breaks is a hugely costly business for financial institutions and – according to SmartStream’s estimates – banks may spend as much as $50 to $65 billion a year tackling the issues which arise as a result of weaknesses in post-trade systems.

Creating post-trade processing efficiencies is challenging, not only because there is a lack of communication between many of the back office IT systems financial institutions use, but also because the banks themselves still inhabit a very silo-based world. The lack of connectivity between asset silos – and the negative impact it has for banks looking to improve the efficiency of their post-trade processing activities – can clearly be seen in relation to reconciliations management.

Traditionally, banks have carried out reconciliations by asset class – for example, for cash, exchange traded derivatives, OTC derivatives – in separate accounts. This has entailed using a different IT system, and sometimes Excel spreadsheets, for each asset class or deploying a single application but one in which each asset class is processed in separate accounts. There are a number of drawbacks to this approach, but two stand out in particular. Firstly, it is very difficult for a bank to get a clear picture of where it stands financially as it lacks a consolidated view into a total equity reconciliation. Secondly, determining that a reconciliation has been completed involves drawing information out from many separate systems, which represents a complex, time-consuming and expensive business.
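To illustrate the difference a consolidated view makes, the sketch below combines the output of three hypothetical per-silo reconciliation systems into a single position view per account and a single list of breaks, rather than leaving each result in its own system. The systems, accounts and balances are all invented; this is not SmartStream's implementation.

```python
"""Illustrative sketch only: combining per-silo reconciliation results into one
consolidated position view and break list. Systems, accounts and balances are
invented."""

# Output of three hypothetical silo systems: (asset class, account, internal, external)
silo_results = [
    ("cash", "ACC-1", 1_000_000.00, 1_000_000.00),
    ("etd",  "ACC-1",       250.00,       245.00),   # exchange traded derivatives
    ("otc",  "ACC-2",    75_000.00,    75_000.00),
]

consolidated, breaks = {}, []
for asset_class, account, internal, external in silo_results:
    consolidated[account] = consolidated.get(account, 0.0) + internal
    if internal != external:
        breaks.append((asset_class, account, internal - external))

print("Consolidated internal positions by account:", consolidated)
print("Breaks requiring investigation:", breaks)
```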

Reconciliations processing is not the only area in which the silo-based nature of banks’ infrastructure obstructs the achievement of greater efficiency. Cash management is another sphere in which financial institutions encounter difficulties. To date, many large financial institutions have managed cash with end-of-day, manual processes and huge numbers of back office staff who gather data, in varied formats and from different systems, from individual business lines or regional accounts. This approach is, first of all, highly labour-intensive and costly. Secondly, there is no single, global view of activity across all currencies and accounts. Banks therefore lack an accurate, up-to-date picture of balances and current exposures, meaning there is no clear view of the risks they run. In addition, the absence of a real-time view of balances means that companies may be unaware of the availability of funds which could be used for other activities, for example trading. As a result, a firm may run the danger of underutilising precious financial resources.
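The cash management point can be pictured with a similarly small sketch: regional, per-currency balances are rolled up into net positions by currency and a single USD-equivalent figure, the kind of intraday global view the article says is missing. Accounts, balances and FX rates are invented.

```python
"""Illustrative sketch only: rolling regional, per-currency balances up into a
single global cash view so unused funds are visible intraday. All figures are
invented."""

balances = [                       # (region, currency, balance)
    ("EMEA", "EUR", 12_500_000),
    ("EMEA", "USD",  3_000_000),
    ("APAC", "USD",  1_750_000),
    ("AMER", "USD", -4_200_000),   # overdrawn account
]

fx_to_usd = {"EUR": 1.10, "USD": 1.00}   # indicative rates

by_currency = {}
for _, ccy, amount in balances:
    by_currency[ccy] = by_currency.get(ccy, 0) + amount

global_usd = sum(amount * fx_to_usd[ccy] for ccy, amount in by_currency.items())

print("Net balance by currency:", by_currency)
print(f"Global position (USD equivalent): {global_usd:,.0f}")
```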

Clearly, rethinking inefficient, duplicative post-trade processes is a priority for the financial industry. So how can technology help them?

Banks have spent many millions investing in the large number of IT systems that currently power their post-trade processes. Yet these systems are often fragmented, resulting in inefficiency and high costs. Financial institutions require a means of bridging traditional asset silos but must be able to do so in a way which enables them to continue to make use of legacy IT systems and derive value from their existing investment in technology. Companies also need technology which allows them to break down the barriers between asset classes in order to create a real-time, centralised view of processing activities and outstanding positions. Achieving a high degree of clarity is vital if financial institutions are to be able to manage risk effectively and make full use of available funds.

To this end, SmartStream has developed its Transaction Lifecycle Management (TLM®) technology, which automates a wide range of post-trade processes. Developed as a single architecture, TLM’s approach is unique. It breaks down silos – tapping into legacy IT systems and exploiting existing technology – and creates one transaction lifecycle, from inception to settlement, taking a truly cross-asset approach to post-trade processing. Results of processing activities are presented in a single, consolidated view, providing companies with a clear picture of their financial standing, as well as the real-time view of the risks they run. The technology improves STP levels significantly and takes a highly pre-emptive approach to dealing with exceptions, enabling financial institutions to reduce trade failures and their associated costs.

Financial utilities: the future of post-trade processing?

At a time when financial institutions, and especially investment banks, face burdensome post-trade processing costs, the industry – SmartStream believes – would benefit not simply from making use of advanced technology to break down the boundaries between asset classes, but also from setting aside some of the barriers between individual institutions in order to participate in financial utilities.

The use of utilities in the financial industry goes back to 2005, when organisations first began to build internal utilities to lower costs. Banks started to move away from the traditional model – where each line of business performed its own reconciliations and so on – and instead created a service able to perform processing activities on behalf of all of a company’s business divisions.

During this early phase, SmartStream worked with a number of banks on the creation of internal utilities, but the company also went on to carry out several projects to construct externalised utilities. These latter undertakings highlighted external utilities’ ability to generate a sizeable return on investment, often far better than that attainable through an internal utility. Gains of 30% to 50% proved possible – significantly higher than the 10% to 20% return on investment typically achievable through an internal utility.

The unique structure of the externalised utility, SmartStream discovered, is particularly well suited to helping users drive down costs. Instead of making use of a number of different systems – as a traditional BPO service provider would typically do – a utility employs a single platform architecture. This approach has several benefits: for example, it solves the difficulty of inter-system messaging, and it makes linking new clients up to the service far easier. In addition, the use of a common architecture facilitates the introduction of new business processes and makes carrying out adjustments necessitated by regulatory change far more straightforward. Any alterations are made by the utility on behalf of users, removing the need for clients to rebuild their own processes.

Financial institutions are turning to externalised utilities on an increasingly frequent basis. Reference data management is one area in which banks have begun to adopt a mutualised service approach.

The industry-wide burden of managing reference data

The financial industry currently spends a vast amount of money on managing reference data. A Tier 1 bank may spend as much as $50–100 million a year on processing reference data, and also employs armies of back office staff to collect, validate, cleanse and reformat that data before distributing it to front-line systems. A large bank is likely to take in reference data from 100 to 120 sources on a daily basis. Each of these feeds requires a data loader, and operating and maintaining the necessary technology is a complex, costly business.

In spite of the effort and investment expended, the industry is plagued by the ill effects of bad data, and financial institutions spend a colossal amount on repairing broken trades caused by poor-quality reference data. SmartStream research indicates that, post-trade, as much as $50bn to $65bn may be wasted annually through the impact of broken trades. Of that, some 30%–40% – at least $15bn – stems from bad reference data.
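
As a rough sanity check on those figures, the arithmetic is straightforward; the short snippet below simply multiplies the quoted ranges together (the inputs come from the estimates above, everything else is illustrative):

```python
# Back-of-the-envelope calculation using the estimates quoted above:
# $50-65bn of annual post-trade waste, 30-40% of it attributed to bad reference data.
total_waste_bn = (50, 65)        # annual cost of broken trades, $bn
ref_data_share = (0.30, 0.40)    # share attributed to poor reference data

low = total_waste_bn[0] * ref_data_share[0]   # 15.0
high = total_waste_bn[1] * ref_data_share[1]  # 26.0
print(f"Implied reference-data-driven waste: ${low:.0f}bn-${high:.0f}bn per year")
```

On those inputs the reference-data share works out at roughly $15bn–$26bn a year, which is why the $15bn figure is best read as the lower bound.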

In the past, financial institutions have taken steps to get to grips with the massive amount of waste caused by poor reference data, but these efforts have met with limited success. The reason lies in banks’ approach to improvement in this area: traditionally, firms have treated such projects as a unilateral activity, ignoring the fact that however much an individual institution spends on upgrading its own reference data, a trade will still break if the counterparty’s data is poor. Forward-thinking institutions have started to realise the importance of moving away from this one-sided approach and are beginning to focus, instead, on ensuring that not just they, but their counterparties too, use the same high-quality information.

Managing reference data in-house, for a company’s sole use, is highly inefficient and costly. The utility approach to managing reference data – such as that pioneered by SmartStream through its Reference Data Utility (RDU) – can confer considerable advantages. Such a utility acts, in effect, as a processing agent for clients’ reference data, receiving information directly from clients’ data suppliers, which it then cleans, cross-references and enriches. Any improvement – for example, a correction to an error in the information received from a data vendor – benefits all clients of the utility. Once cleaned, data can be delivered to clients’ infrastructure. In the case of the RDU, it is delivered as a single schema per asset class, which may also be customised, allowing users to consume whichever aspects of it they wish.
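
To make the clean/cross-reference/enrich workflow more concrete, here is a minimal sketch of the kind of “golden record” step such a utility might perform. It is an illustration only – the field names, vendor feeds and precedence rule are invented for the example and are not SmartStream’s RDU schema or logic.

```python
# Illustrative only: a toy "clean, cross-reference, enrich" step of the kind a
# reference data utility performs. Field names and precedence rules are invented.
from dataclasses import dataclass

@dataclass
class VendorRecord:
    vendor: str
    isin: str
    name: str
    currency: str

def build_golden_record(records: list[VendorRecord], precedence: list[str]) -> dict:
    """Cross-reference records for one instrument and keep the value from the
    highest-precedence vendor for each attribute (a common 'golden copy' rule)."""
    ranked = sorted(records, key=lambda r: precedence.index(r.vendor))
    golden = {"isin": ranked[0].isin, "name": None, "currency": None}
    for rec in ranked:
        golden["name"] = golden["name"] or rec.name.strip().upper() or None
        golden["currency"] = golden["currency"] or rec.currency.strip().upper() or None
    return golden

feeds = [
    VendorRecord("vendor_b", "XS0123456789", "acme 5% 2025 ", "eur"),
    VendorRecord("vendor_a", "XS0123456789", "ACME 5% 2025", "EUR"),
]
print(build_golden_record(feeds, precedence=["vendor_a", "vendor_b"]))
```

Because every client receives the same cleaned output, a correction made once at the utility propagates to all of them – the mutualisation benefit described above.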

The RDU, which currently serves three Tier 1 banks, can save 30%–40% of the direct operational costs of data management systems. Because it improves the quality of reference data upstream, far fewer broken transactions result downstream, creating further cost savings. Technological complexity can be reduced, too: instead of tens of data feeds, users simply need one feed per asset class. The utility’s mechanism for delivering cleaned data also supports institutions engaged in trading hybrid products. Its distribution model is highly integrated across asset classes, with data delivered in a way that shows the links between them. This, in turn, simplifies activities such as assessing credit risk in relation to a particular counterparty, removing the need to search for information across numerous separate sources.

Importantly for the industry, the cross-referencing carried out by the RDU allows the internal systems of participating firms to use a common data model and so “speak the same language”. As more banks and their clients link up with the utility, this should allow a de facto common reference data standard to emerge, helping to consign to the past the trade breaks that currently impose such high costs on the industry and provoke so much aggravation between banks and their buyside clients.

Looking to the future

The utility model offers banks a vital lifeline at a time when the sector faces a global rise in trade volumes, increased regulatory oversight and stubbornly high post-trade costs. And it is not just reference data management that can benefit from the utility approach. The bulk of post-trade services – most of which contribute nothing to a financial institution’s bottom line but are carried out individually, and at great cost, by every company across the industry – are ripe for the same treatment and could benefit from financial utilities serving the sector as a whole.

In the future, increasing numbers of banks are likely to embrace this new approach. Lower revenues, coupled with the unsustainable expense of maintaining huge numbers of back office staff, will push financial institutions into accepting such a change. Technology may well, therefore, begin to break down barriers between financial institutions as organisations realise the value of the utility model and its ability not only to alleviate heavy post-trade costs but also to create a leaner, more agile and more efficient banking industry – one far better suited to survive the rigours of the future.

©BestExecution 2016


Mastering MiFID II: Implications for Execution in Investment Banking — Assessing the Cost of Compliance


This report is the first in a series of GreySpark Partners reports examining buyside and sellside business and operational imperatives associated with compliance with the second iteration of the EU’s Markets in Financial Instruments Directive (MiFID II). It explores the costs that investment bank trade execution franchises will incur as a result of compliance with MiFID II.


https://research.greyspark.com/2016/mastering-mifid-ii-implications-for-execution-in-investment-banking/

Fixed income trading focus : Overview : Dan Barnes

A WIDENING LIQUIDITY GAP.

As banks withdraw from fixed income markets and profits continue to fall, the buyside is actively hunting down liquidity. Dan Barnes takes stock of the situation.

Buyside firms are having to work harder to fill orders, with many investing heavily in technology and investigating new access models for the market. From the perspective of the trading desk it is clear that liquidity is reducing; banks are not making prices like they used to.

“Structurally the market is changing,” says Mike Lillard, chief investment officer at PGIM, an asset manager with $1trn in assets under management (AUM), and head of Prudential fixed income. “You are finding liquidity from different places within the markets and liquidity in the markets is lower. If you looked at the markets pre-crisis, relative to what it is today, it’s clearly lower than it used to be.”

The consensus amongst market participants is that the old way of doing business, with banks taking risk trades to support turnover, is not coming back. In Q1 2016, sellside revenues for fixed income, currencies and commodities (FICC) were half those seen in 2011, standing at $17.8bn, according to analyst firm Coalition. Credit has been worst affected within the fixed income space, falling to $2.9bn in Q1 2016 from $7.2bn in Q1 2014. FICC headcount fell by 11% over the same period.

Despite the reduction in staff, expenses are rising. Boston Consulting Group estimates that FICC sales and trading IT budgets have more than doubled since 2012, while regulation is offsetting the benefits of cost-cutting programmes. The Fundamental Review of the Trading Book (FRTB) being conducted by the Basel Committee is intended to harmonise the approach to market risk. However, the US-based consultancy believes the new rules could collectively increase the risk-weighted assets held on banks’ balance sheets by 28%, driving down the return on equity for investment banks from 5.7% to 3.4%, against a cost of capital of between 10% and 15%. Dealers are working hard to fight against this tide, say buyside firms.

Nicholas Greenland, managing director and head of Broker-Dealer Relationships for BNY Mellon Investment Management, which has $1.6tn AUM, says: “They continue to work closely with us and are trying to find ways of working even more closely with us. Part of this is being data driven as they look at the host of metrics by which they can analyse our clients’ business with them to understand if there are any areas that could be fine-tuned.”

Access all areas

Several dynamics are now in play that may create a chance to increase liquidity. The first is to mobilise buyside liquidity more effectively.

This will see the firms who hold the most assets – institutional investors – increasing their turnover and making prices in order to facilitate trading. Buyside firms will also have to react to moves in the market more quickly.

“We have always empowered our trading desks to do that so we have to be responsive to the market, we have to look for the opportunities when they present themselves – we take advantage of those,” says Lillard. “So to the extent you would say that makes us act more as a liquidity provider to the market absolutely, I am ready to buy and sell securities all the time.”


Another needed change is the accessibility of markets. A point being discussed in the US treasuries and credit markets is the potential for investment managers to begin accessing interdealer markets. Currently these are restricted to sellside firms, proprietary trading firms and certain hedge funds. Gaining entry has long been a political issue for the market operators as dealers see this as disintermediation, yet traders have reported several plans by interdealer brokers to open up to buyside trading.

One option being explored in order to avoid the political problem is the use of a sponsored access model. In many exchange-driven markets, such as equities and futures, dealers provide a ‘pipe’ to the trading venue for clients, with certain pre-trade risk checks required to ensure the market rules are not broken. This allows the brokers to remain a conduit, albeit on a sub-agency basis.
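
As a rough illustration of how such a sponsored-access ‘pipe’ gates orders, the sketch below applies a few typical pre-trade checks – a per-order size cap, a notional limit and a price collar. The specific limits, field names and rules are assumptions for the example rather than any broker’s actual controls.

```python
# Illustrative pre-trade risk checks for a sponsored-access order flow.
# Limit values and check rules are hypothetical.
from dataclasses import dataclass

@dataclass
class Order:
    client: str
    instrument: str
    side: str        # "buy" or "sell"
    quantity: int    # face value
    price: float

@dataclass
class RiskLimits:
    max_order_qty: int
    max_notional: float
    price_collar_pct: float  # max deviation from reference price, e.g. 0.05 = 5%

def pre_trade_check(order: Order, limits: RiskLimits, reference_price: float) -> tuple[bool, str]:
    if order.quantity > limits.max_order_qty:
        return False, "order quantity exceeds per-order limit"
    if order.quantity * order.price > limits.max_notional:
        return False, "notional exceeds client limit"
    if abs(order.price - reference_price) / reference_price > limits.price_collar_pct:
        return False, "price outside collar around reference price"
    return True, "accepted and forwarded to venue"

limits = RiskLimits(max_order_qty=1_000_000, max_notional=25_000_000.0, price_collar_pct=0.05)
order = Order("client_x", "UST 2.0% 2026", "buy", 100_000, 99.50)
print(pre_trade_check(order, limits, reference_price=99.60))
```

The point of the model is that the sponsoring broker keeps these controls, and therefore the regulatory responsibility, while the client’s orders flow straight through to the interdealer venue.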

Viable solutions

Existing market operators are delivering new ways to match buyers and sellers. Tradeweb has introduced actionable axes for credit, exchange traded funds and government bonds, with the axes being displayed on Tradeweb or sent to order management systems (OMSs). By 1 June 2016 the firm reported the number of clients using its axe functionality had increased by 131% since the start of the year.

“We see two common themes developing across fixed income markets in Europe; one theme is around the need to improve the exchange of information between the buy and sellside with respect to where trading can occur. We invested in a series of technological solutions that enable market makers to communicate to clients whether they are axed and which side,” says Enrico Bruni, head of Europe and Asia business, Tradeweb. “Secondly we have developed tools allowing clients to search axes across different types of securities in different markets, and direct requests for quote to a specific set of dealers.”


MarketAxess has launched its Open Trading all-to-all protocols in the US and Europe, and in May 2016 reported that its global clients had traded a record $944m of European credit products via Open Trading protocols in the first quarter of 2016, up 140% from the fourth quarter of 2015.

Numerous trading platforms have launched – and continue to launch – in the credit and rates markets, from equity market operators such as Liquidnet to start-ups such as Electronifie and Trumid. Traders are keen to see the struggle amongst platforms for liquidity resolved.

Buyside trading desks are seeking tools that will enable them to find liquidity and measure performance, so they can participate more actively and suffer less when liquidity works against them. Trading opportunities are often fleeting and traders need to jump on them when they get the chance.

From an innovation perspective this creates two demands: the first is for venues, and connectivity to venues, that allow traders to access liquidity wherever it occurs; the second is for data – and data analysis tools – so that they can value opportunities accurately and manage their participation in the market as price makers.

“We built our transaction cost analysis (TCA) solution against our consistently available trading composite; we spend a lot of time monitoring how client transactions are executed relative to this composite to check its validity and precision, and we have used that as a basis for TCA,” says Simon Maisey, managing director and global head of business development at Tradeweb. “We can measure TCA against the composite in both basis point and cash terms, and as a percentage of the bid-offer spread to account for the different levels of liquidity in different products. You can get a really good comparison of how you are executing and what different parameters are influencing the kind of pricing level you are getting.”
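
To illustrate the mechanics rather than any particular vendor’s methodology, the sketch below expresses slippage against a composite mid in basis points, in cash terms and as a percentage of the composite bid-offer spread; the composite prices, trade details and per-100 price convention are assumptions for the example.

```python
# Illustrative TCA calculation against a composite price. The composite bid/ask,
# trade details and conventions below are assumed for the example.
def tca_vs_composite(side: str, exec_price: float, face_value: float,
                     composite_bid: float, composite_ask: float) -> dict:
    mid = (composite_bid + composite_ask) / 2.0
    spread = composite_ask - composite_bid
    # Positive slippage = execution worse than the composite mid.
    signed = (exec_price - mid) if side == "buy" else (mid - exec_price)
    return {
        "slippage_bps": 1e4 * signed / mid,
        "slippage_cash": face_value * signed / 100.0,      # prices quoted per 100 face
        "pct_of_spread": 100.0 * signed / (spread / 2.0),  # relative to half the bid-offer
    }

print(tca_vs_composite("buy", exec_price=100.06, face_value=5_000_000,
                       composite_bid=99.98, composite_ask=100.10))
```

Expressing the result as a percentage of the spread, rather than in raw basis points, is what makes results comparable across bonds with very different liquidity profiles.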

Getting strong pre-trade and post-trade information is crucial to traders who have to switch between electronic and voice, with considerable uncertainty as to how the market will pan out. In Europe the MiFID II rules, coming into effect in 2018, will create greater pre- and post-trade transparency, beyond that provided by the TRACE system in the US. The US Treasury is discussing greater post-trade reporting for the US government bond market.

Firms like Algomi and B2Scan have developed tools specifically in the pre-trade space, and while uncertainty persists around platforms, traders will see the need for this information grow. Analyst house Greenwich Associates found a reversion to voice trading in European corporate bond markets last year, with the electronically traded share of total volume falling from 50% in 2014 to 46% in 2015. Although the trend is towards electronification, flexibility on the trading desk will be crucial during the period of transition.

©BestExecution 2016


Profile : Pauli Mortensen & Øyvind Schanke : Norges Bank

Øyvind Schanke, Norges Bank Investment Management

TAKING THE LIQUIDITY REINS.

Pauli Mortensen, Global Head of Fixed Income Trading, and Øyvind Schanke, Chief Investment Officer, Asset Strategies, at Norges Bank Investment Management believe asset managers should adopt a much more proactive role in today’s changing environment.

Pauli Mortensen has been Head of Trading, Fixed Income, globally at Norges Bank Investment Management, in Oslo and London, since October 2013. He has held many different roles since he joined the group in 2001, including Head of Trading, Fixed Income, Eurasia; Senior Trader, Treasury and Fixed Income Trading; and Senior Portfolio Manager, Relative Value Strategies, Oslo. Mortensen started his career at Nordea/Unibank in 1992, where he was an economist/analyst, a senior dealer in foreign exchange and money markets, and Chief Proprietary Trader, Treasury Markets. He has a master’s in economics from the University of Copenhagen.

Øyvind Schanke has been Chief Investment Officer, Asset Strategies, in London since 2014. He joined the group in 2001 as Head of Trading, Oslo, before becoming Global Head of Equity Trading, Oslo, in 2008. Previously, Schanke was an equity sales/sales trader at Gjensidige Nor Equities and a sales trader at Handelsbanken Markets, New York. He has an MBA in Finance from the Norwegian School of Economics (NHH).

How do you see MiFID II impacting the fixed income sector and what changes do you think the industry should make?

Pauli Mortensen: I think MiFID II will be helpful. Today, we are trading with blinkers on for many securities. If you search for x or y in dark pools it can be cumbersome to agree on a clearing level because you don’t know where the fair value is. Post-trade transparency will be beneficial to the market.

Øyvind Schanke: I agree, but I think that MiFID II does not go far enough when it comes to post-trade transparency because there are too many bonds that are exempt from the regulations. In terms of the changes we would like to see in fixed income, we are supportive of a move towards an agency model alongside the traditional principal-based model. There is, however, resistance from the largest banks, and not everyone on the buyside is pushing for this either. There is a view that because this is the way we have always done things, we should continue doing so. We do not believe that is a healthy approach.

What are the benefits of also having an agency model?

Pauli Mortensen: When it comes to pre-trade transparency, the agency model will improve liquidity in bond markets. In a principal world, under MiFID II, we will see wider bid-offer spreads because systematic internalisers will have to quote for other clients. However, with an agency model, there will be no intermediary and the buyside can post a bid-offer directly onto a platform.

Øyvind Schanke: There are around 30 or so different trading platforms on the market and they all want us to connect to them. However, the banks are in the middle and they should connect on our behalf. An agency model would be ideal for letting us plug directly into both lit and dark platforms.

Overall, what is your assessment of electronic trading and how do you see it evolving in the future?

Øyvind Schanke: When it comes to electronic trading, I think we are moving in the right direction. Over the last 10 to 15 years, Tradeweb, Bloomberg and MarketAxess have made it much more efficient to trade fixed income versus the traditional voice trading. However, it will be a long time before we see it work completely. This is because the buyside has not changed the way it executes trades. Request for quote is still the most popular method and firms are not using a type of central limit order book to post prices. Change is always difficult, and as the street used to provide a lot of liquidity there is the hope that it will return, but it is different now. It is ironic that issuance is higher than ever, but there is a lack of liquidity because banks are no longer able to warehouse risk due to stricter banking regulation. The question is how can we make all the liquidity points meet?

Pauli Mortensen: The buyside fundamentally needs to trade, but the sellside is now constrained so when it comes to electronic trading of fixed income I believe it will take off in an exponential manner. As long as there are vendors out there who provide the right technology then it is only a matter of time before electronic trading attains critical mass. I also see platforms emerging where both the buy and sellside participate together. However, it is not just a technology issue, but also requires a behavioural change for the buyside. If they decide to wait for liquidity to come to them and not take an active approach, then they will be waiting for a long time.

What about the role of corporates?

Øyvind Schanke: I definitely think that corporates also have a bigger role to play. They should ask themselves if it is really necessary to issue so many non-standardised bonds that by default become illiquid. There are banks with thousands of outstanding bonds, and although they will argue it’s for maturity matching, they are not doing themselves any favours. Asset owners should care about this, as it leads to higher cost of capital. I think there needs to be a maximum number of bonds a company can issue.

What if any changes will Norges have to make?

Pauli Mortensen: We have facilitated the move to being more of a price maker as well as price taker and have given our traders more discretion in how they can trade with a short-term investment horizon according to pre-defined boundaries. To safeguard the long-term investment horizon each trader belongs to an investment team that includes portfolio managers and analysts, and we make sure to incentivise the trader to enhance the performance of the team’s portfolio. It is no longer just about his ability to execute, but also to provide market intelligence, develop ideas and help implement the portfolio managers’ ideas. The trader adds value by explaining what to focus on and how we can take advantage of dislocations in the market.

Øyvind Schanke: Yes, we separated the role of portfolio manager and trader in fixed income after 2008. We already had this separation in equities and what we saw as a natural barrier between the two in the fixed income world has now become more blurred. We decided we wanted to be more proactive, given liquidity has become such an issue, and changed the structure so that they could work more closely together.

What impact has the drop in oil prices had on your fixed income strategy?

Øyvind Schanke: The fall in oil prices has not changed our investment strategy. The income from dividends as well as coupons is large enough to offset the withdrawals from the government. We have a global approach which is still 60% invested in equities, 37% in fixed income and the rest in real estate. We also apply environmental, social and governance criteria across the entire portfolio regardless of the instruments.

What do you see as the biggest challenges as well as opportunities going forward?

Pauli Mortensen: I think the challenge can also be seen as an opportunity. As I said earlier, the buyside need to restructure the way they trade and become not only price takers, but also price makers. They underestimate how much value this can add to their investment decisions. My fear though is that they are so pre-occupied with regulation and requirements that they are not making this a top priority.

©BestExecution 2016


Technology : Disruptive technology : Louise Rowland

HOT, COOL AND MAKING WAVES: DISRUPTIVE TECHNOLOGY RACES ON.

Fleet-footed fintech start-ups are rocking the boat, with UK firms showing the way. But what does it all mean for the financial services industry and its participants? Louise Rowland reports.

Funding in fintech is hitting new highs. According to a recent Accenture report, fintech start-ups globally received $5.3bn (£3.7bn) in funding in the first quarter of 2016 alone, an increase of 67% on the same period last year.

Why all the hype? Many things get swept into the fintech bucket, but the term essentially refers to technologies that facilitate things which could not previously be done, or had not even been conceived of. Some argue that current developments are more evolutionary than revolutionary: technologies such as encryption, distributed ledgers and smart contracts have been around for a while. The difference now is that these elements are being architected together, enabling exciting new business models to emerge.

Either way, the possibility of completely different ways of operating is good news for a financial services industry struggling with crippling costs, intense regulatory pressures, negative interest rates, weak revenues and depressed margins.

Setting the pace

Today’s most touted technologies are blockchain, machine learning, data analytics, robo advisers and artificial intelligence – most of them harnessing the capabilities of cloud technology. The retail end of the industry is setting the pace with innovations such as peer to peer lending, crowd-funding, digital banking and online robo advisers. Nutmeg, an online investment platform for small investors, and Estimize, a crowd-sourced financial estimates platform, both typify this breaking of the mould.

Should traditional incumbents be worried as some of these innovations start to move up the value chain? Michael Cooper, CTO Radianz, Global Banking and Financial Markets, BT, says the threat actually lies in not disrupting – in the industry not being willing to take advantage of the new developments. “There’s a lot of noise around fintech in almost all organisations now. Inevitably, some people are more willing to adopt it than others but there are not many players in New York and London, and indeed globally, who are not innovating.”

Chip off the new block

Some of the loudest noise is around blockchain – encryption-based distributed ledger technology – originally developed almost a decade ago to underpin the digital currency bitcoin.
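
For readers new to the term, the toy sketch below shows the hash-linking at the heart of any blockchain-style ledger: each block commits to the hash of its predecessor, so past entries cannot be altered without breaking the chain. It is a conceptual illustration only, not the design of SETL, Rise or any other platform discussed here.

```python
# Toy illustration of an append-only, hash-linked ledger: each block commits to
# the previous block's hash, so altering history breaks the chain.
import hashlib
import json

def make_block(prev_hash: str, transactions: list[dict]) -> dict:
    payload = json.dumps({"prev": prev_hash, "txs": transactions}, sort_keys=True)
    return {"prev": prev_hash, "txs": transactions,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

genesis = make_block("0" * 64, [{"settle": "bond XYZ", "from": "A", "to": "B"}])
block_1 = make_block(genesis["hash"], [{"settle": "cash EUR 1m", "from": "B", "to": "A"}])

def valid_chain(chain: list[dict]) -> bool:
    """Recompute each block's hash and check the linkage to its predecessor."""
    for prev, curr in zip(chain, chain[1:]):
        recomputed = make_block(curr["prev"], curr["txs"])["hash"]
        if curr["prev"] != prev["hash"] or curr["hash"] != recomputed:
            return False
    return True

print(valid_chain([genesis, block_1]))  # True until any past entry is tampered with
```

Real distributed ledgers add consensus, permissioning and digital signatures on top of this basic structure, which is where most of the engineering effort discussed below actually goes.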

Advocates say blockchain could be transformative for the industry. It has the potential to revolutionise back office processes and slash costs, ripping out multiple data layers and reducing counterparty and systemic risk. It could also offer possible new ways for asset managers to interact with regulators – and each other.

Blockchain will save billions of dollars in processing charges, says Peter Randall, CEO of SETL, a new multi-asset, multi-currency institutional payment and settlement blockchain, which recently launched its blockchain-powered OpenCSD platform.

“Ledgers and reconciliation costs are cripplingly huge. The top 15 to 20 banks are spending a fortune on back office, with some running perhaps 250 separate systems which don’t talk to each other. The big project for this decade is the application of 21st century technology to the post-trade space. Blockchain is very deliverable, very clear, very good working technology that delivers with industrial capacity – an end-to-end system with real world value and capacity. The trick is to see that it’s not a technology project but an engagement project.”

It’s not for beginners, stresses Thorsten Peisl, CEO of Rise Financial Technologies, which is developing a new post-trade blockchain to optimise securities settlement, safekeeping and servicing for custodians. “Decentralised ledgers are very complex. New entrants need deep industry understanding and not just technical know-how. You can’t learn about the industry overnight. You need a combination of quality development resources and business people with the necessary expertise.”

Not everyone views blockchain as a universal panacea. Anders Kirkeby, Head of Technology Product Management at software vendor SimCorp, says: “I don’t think we’ll see blockchain widely used within the next few years, although it will likely play a significant role in the post-trade processing of certain asset classes such as illiquid assets. I don’t believe that blockchain will replace every intermediary in the next two years. It will be a far more evolutionary process.”

Data control

Firms across the industry are also leveraging the huge volumes of data they are now required to possess, allowing them to understand more about their business and end users than ever before.

Machine learning provides the opportunity to make predictions based on known properties of big data. Data analytics takes things even further, focusing on previously unknown properties of the data to enable firms to come up with totally new products and services. “It’s a bit of magic, but explainable magic,” says one software vendor.

Artificial intelligence and robotics are also increasingly prevalent, poised to play a major role in an industry that is heavily algorithm-based.

It’s not just all about exploiting new technology, though, says Keith Saxton, Chairman of the Financial Services Programme at industry body techUK. “There are existing mature technologies in financial services that could also reduce costs, such as process technology and standardised system design. People haven’t adopted this technology enough, making the most of, and modernising, what they’ve already got. You could say, ‘how about starting off by doing this as well as everyone else in other industries?’ In reality, without taking this approach, it will be difficult to leverage new fintech capabilities.”

Hipster meets high finance

Disruptive technology may have started out challenging the established order. Yet, increasingly, the mood is one of collaboration and ‘co-opetition’ as the economic model changes and well-established industry players join forces with young fintech start-ups via hubs, incubators and accelerators. UBS’s initiative with Level 39 is one example of this. “Increasingly, no one has the answer to everything, so we’re seeing the disruption of the whole paradigm,” says one sellside participant.

Leda Glyptis, director at Sapient Global Markets, offers a word of warning. She argues that huge sums are being spent across the fintech ecosystem in an attempt to stay up to speed with developments and competition, without firms always aligning that activity to customer and relationship impacts. “Success in this space also needs intellectual investment. Firms should think about their strategic positioning, what they perceive their relevance to be, rather than just which technology to use to do what they do now, faster/cheaper/better. You’re not going to spend millions dismantling your infrastructure just to be a fraction faster. But you could invest millions if you’re going to deliver an incredible set of services to your customer and value to your shareholders.”

Keeping the regulators happy

One major subset of fintech is RegTech, which is exploiting key disruptive technologies to ensure regulatory compliance. Jon May, CEO of KYC.com, a joint venture powered by Markit and Genpact, explains: “RegTech is a real enabler behind the scenes. It’s bringing the ecosystem together for the regulatory and technical spaces on both the buyside and the sellside, taking non-differentiating processes and cost-mutualising them. Huge progress is being made, making it a win-win situation for all sides.”

Regulators are keeping a close eye on fintech developments. The Financial Stability Board, for example, is working on a study of the systemic risk issues raised by new innovations such as blockchain. Changes are afoot, agrees Emily Reid, partner at Hogan Lovells Financial Institutions Sector. “Most of the innovative products and services we’re seeing on the retail side are covered by the existing regulatory regime; the innovation is driven by technology which has transformed the design and delivery of these products and services. What is needed is regulatory change to allow for innovation in the design of documentation provided when selling and providing these products and services, so that financial service providers communicate better with their customers.”

“Some innovation may need to be specifically regulated,” adds Reid. “Blockchain has many potential applications, including how the ownership of assets is recorded and transferred. Time will tell how much of the existing regulatory framework is relevant to these applications.”

A watershed moment?

The pace of change driven by fintech is fast and furious. In ten years the landscape will have altered dramatically – although many believe the changes are likely to be incremental rather than overnight. One observer predicts widespread M&A activity over the next few years, with the banking industry winnowed down and moving towards an agency model with increased bundling of services. The key is for firms to harness the new technology to disrupt their own business model and reimagine themselves digitally.

We’re in a Darwinian place again, says Terry Roche, head of Fintech research, TABB Group. “It’s all about the survival of the fittest – those who can gain control of their costs, who can transform their brittle monolithic infrastructure in a dynamic way and who have access to the broadest amount of data and understand it in real time will realise market advantages.”

©BestExecution 2016

