With Michael Rude, COO, REDI Global Technologies; Gary Stone, Chief Strategy Officer, Bloomberg Tradebook; and Eugene Finkelman, Executive Director, Electronic Client Solutions, J.P. Morgan
Michael: Regulators are increasingly focused on electronic trading systems, and are enforcing rules and regulations governing the processes, procedures and controls that promote a more stable trading environment. Market participants are generally responding with a more rigorous software development lifecycle and with improved documentation and testing practices. The net effect of these changes could be a more controlled, compliant and stable marketplace.
That said, there is opportunity for improvement around go-live testing. Market participants who are releasing system changes to production are required to perform a pilot test to ensure the system is performing as expected. However, most exchanges do not allow ‘test’ orders in production, or any order that does not have an economic purpose. Absent the availability of test symbols in production, market participants are limited in their ability to validate market readiness before the market opens.
It’s not entirely clear why the exchanges resist the adoption of test symbols in production. In some cases, it’s because the addition of test symbols requires technology changes for the exchange. But it seems incongruous to have such extensive rules and regulations governing the development of electronic trading systems, but such limited tools to evidence the efficacy of our systems in a live environment.
Gary: I cannot understand why regulators are silent on this. There are regulations that we cannot test with live orders, yet we are forced to because we have no other method of testing connectivity. Fragmented markets are also a real concern. Regulators need to standardise test tickers across markets within a single asset class so that Smart Order Routers can verify that they are connected and sending appropriate orders IN PRODUCTION. Just because a system works in beta, it cannot be assumed that it will work in production.
Origins
Michael: Whether it’s from an execution standpoint or from a clearing standpoint, it is universally viewed as a systemic risk if we can’t test systems across asset classes. There are certain test symbols that are available for use on NYSE for example, but such use is not permitted globally across asset classes.
While we are happy to acknowledge that it makes perfectly good sense to test your systems, at the moment, before we start rolling out into production, we're not allowed to do that last leg; we're not allowed to certify in a live environment. Even the brokers aren't allowed to send in "test orders"; they have to be real orders with real economic purposes behind them.
Eugene: Whether testing a change or validating production status, market participants need a safe way of interacting with the production environment.
As part of the FPL Risk Mitigation Symbology Working Group, we are in discussions with one of the exchanges right now, trying to put two and two together: we want to test, and a dedicated test symbol will help in that effort. If we do testing, should it use a dedicated symbol or random live symbols? Not everyone sees the need for test symbology just yet.
How will these work?
Eugene: The working group has put out requirements around the creation and expected behavior of these symbols.
At a high level, the key is for these instruments to mimic the exact behavior of any other instruments on that market, except these would be zero-funded no-risk securities.
They should be listed, have standard identifiers (RICs, ISINs, etc.) and should support the standard order types in those markets.
They should be available for trading during regular market hours, in addition to pre- and post-market sessions, so that participants can route orders on these symbols and receive standard electronic acknowledgements and executions.
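To make the requirement concrete, the sketch below shows roughly what an order on such a symbol might look like at the FIX level: an ordinary NewOrderSingle whose only distinguishing feature is the symbol itself. ZVZZT is one of NASDAQ's published equity test symbols; everything else here (CompIDs, ClOrdID, order details) is a hypothetical placeholder, and session-level fields a real FIX engine would supply (MsgSeqNum, SendingTime) are omitted for brevity.

```python
SOH = "\x01"

def fix_message(fields):
    """Assemble a FIX 4.2 message, computing BodyLength (9) and CheckSum (10)."""
    body = SOH.join(f"{tag}={val}" for tag, val in fields) + SOH
    raw = f"8=FIX.4.2{SOH}9={len(body)}{SOH}" + body
    checksum = sum(raw.encode("ascii")) % 256   # mod-256 sum of every byte so far
    return raw + f"10={checksum:03d}{SOH}"

test_order = fix_message([
    (35, "D"),            # MsgType = NewOrderSingle
    (49, "BUYSIDE01"),    # SenderCompID (hypothetical)
    (56, "EXCHANGE"),     # TargetCompID (hypothetical)
    (11, "SOD-CHECK-1"),  # ClOrdID, tagged clearly as a start-of-day check
    (55, "ZVZZT"),        # the test symbol -- never a live instrument
    (54, "1"),            # Side = Buy
    (38, "100"),          # OrderQty
    (40, "2"),            # OrdType = Limit
    (44, "10.00"),        # Price
    (59, "0"),            # TimeInForce = Day
])
print(test_order.replace(SOH, "|"))
```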
Exchange incentives?
Eugene: This would help exchanges to make their environments safer, raise overall market confidence and establish consistent protocols for everyone to follow.
Achieving all this will provide an edge for early adopters, as it would be an innovative and effective tool that a liquidity venue can offer clients to help them address their risk concerns, which may in turn lead to increased flow and participation in such venues.
Michael: It's possible that exchanges could have an economic incentive – for example, the Hong Kong Stock Exchange charges for the use of its test environment.
So the top-level aim of the entire project is that test symbols would be available in the live environment so that you don't have to test with live orders; which increases the stability of the orders going in, builds confidence in your own systems, and also builds confidence in the market itself?
Eugene: The whole project is designed to provide the industry with a safe means of validating the production environment, by establishing test symbology for all asset classes and recommended best practices around the use of such symbology. This helps to establish a consistent and reliable business process, thus reducing the risk behind production changes and increasing confidence for market participants.
Gary: It could be argued that Knight would not have happened if they could have linked to a test ticker and run their systems before using real tickers.
Why hasn't this been done already?
Eugene: Participants need time, resources and money to make this happen. This could mean establishing processes to make sure that they have everything available to support it. For example, there's not going to be natural liquidity out there, so you need to be able to provide liquidity for these test symbols. It starts with the exchanges, or whichever liquidity venues move first. Once we have one, two, three venues going, we can in turn start having buy-sides, sell-sides and OMSs support this.
One of the points to be discussed is whether the exchange could set a threshold on their side: if there are concerns that someone could flood the system with a big volume of test orders they can set up thresholds to only take X number of tests. That should be part of the best practices around this, for the exchange to think through.
The best practices document is being scrutinised by one of the big exchanges right now. Best practices include recommendations for all market participants to use this test symbology for verification of all production changes, whether implemented by the buy-side, sell-side, OMSs or the exchanges themselves. Verification of connectivity and systems is part of this. Firms could use it in the morning as a start-of-day check, to make sure that they are connected properly and that everything is working as expected, and then as a tool for post-outage verification: if an issue is encountered intra-day, firms can send test orders to confirm everything is back up prior to resuming live trading.
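A rough sketch of how that start-of-day and post-outage check might be wired up, with live trading gated on a successful test-symbol round trip. The send_test_order and await_response callables stand in for a firm's own order plumbing; they are hypothetical names, not any vendor's API.

```python
import time

TEST_SYMBOLS = {"equities": "ZVZZT"}   # per-venue test tickers would live here

def verify_venue(venue, send_test_order, await_response, timeout_s=5.0):
    """Return True only if the venue acknowledges or fills a test order in time."""
    order_id = send_test_order(venue, TEST_SYMBOLS["equities"], qty=100)
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        # await_response is assumed to block for up to `remaining` seconds
        msg = await_response(venue, order_id, remaining=deadline - time.monotonic())
        if msg is None:
            continue
        if msg["type"] in ("ACK", "FILL"):
            return True
        if msg["type"] == "REJECT":
            return False
    return False   # silence is treated as failure: do not resume live trading

def start_of_day_check(venues, send_test_order, await_response):
    """Map each venue to a pass/fail flag; failures are escalated, not traded."""
    return {v: verify_venue(v, send_test_order, await_response) for v in venues}
```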
If you create a test symbol that you can use to place an order, get an execution, and complete the allocation and clearing process, you can then truly test every piece of the infrastructure within the lifecycle of the trade to ensure that everything is working as designed.
What’s next?
Eugene: The aim of the working group is to have test symbology across all asset classes. Phase 1, the current focus, is to at least have these symbols available for electronic order routing and execution on the major markets; the ultimate goal of the whole project is to support the full end-to-end lifecycle, including allocations, clearing, et cetera.
The Case For Test Symbology
GlobalTrading Survey Findings: Big Data Trends in the Electronic Trading Industry.
By Yulia Kuksina, Global Sales and Marketing, GlobalTrading.
In April 2014 the GlobalTrading journal conducted a survey of senior market participants with the objective of understanding recent trends in big data and analytics in electronic trading. Senior representatives from the buy-side, sell-side, exchange and vendor communities contributed their opinions on the matter.
The key overall message from the survey was that automation is a growing trend on the trading desk; this reply was unanimous. In addition, all respondents supported the view that managing big data was either essential (50% of respondents) or important (the other 50%).
The survey also supported the recent trend of using big data to improve decision-making, with around 30% of respondents agreeing that it is the key business driver for this type of technology at their firms. Regulatory compliance was the second most important driver. Among the other important drivers for implementing big data technology, respondents identified risk management and prevention, and experience/historical analysis. Operational improvement topped the list of somewhat important business drivers for using big data, although it was last on the list of most important priorities.
The survey also looked at the top priorities associated with data management. Standardisation was identified as one of the top priorities. Surprisingly, data protection did not score highly.
Spending Trends on Big Data, Data Analytics and Cloud Technology
90% of respondents predict spending to increase, and the majority see a combination of outsourcing and in-house capability building as their likely future big data implementation strategy. The survey also supported the trend towards higher uptake of cloud technology in the industry, with around 40% of respondents already using the cloud and about 60% predicting that spending on cloud technology will increase in the near future. Data protection remains one of the key concerns with regard to moving data storage to the cloud.
Over 80% of respondents are already using data analytics tools and, of those who do not, 90% are considering using them in the near future.
According to the survey, the use of historical analytics tools in the electronic trading industry dominates that of predictive tools by around 20%. 90% of respondents believe that historical analytics tools are more reliable than predictive ones.
Despite the growing buzz around social media, 70% of respondents do not see value in integrating data from social media.
Don’t mess with the US : Lynn Strongin Dodds
It is being called the revenge of the 'soccer mom.' After years of cheering their offspring from the touchline, the US has finally awoken to the joys of football. People of all ages were gripped by the better-than-expected performance of their team, and the enthusiasm hasn't abated despite its valiant exit at the hands of Belgium.
Even the traders on Wall Street were transfixed, turning their eyes from screens flashing numbers to the television monitors broadcasting the games. This may explain why 257m shares traded on US exchanges in the 30 minutes after the start of the Germany versus US game, about 6% less than the 12-day average before the World Cup, according to Bloomberg data. Overall though, and no connection has been drawn, the S&P 500 has risen 1.4% since the tournament started on June 12, making it the best performing index among the 16 countries that survived the first round of play.
While the US football team’s true grit against top European teams Portugal and Germany is still causing a stir, what is less surprising perhaps is the tenacity shown by the nation’s regulators against European banks. BNP Paribas, for example, had initially hoped to fend off a guilty plea by setting up an entirely new subsidiary to be the fall guy but prosecutors rebuffed the idea. The result was that after months of heated negotiations, the French bank paid a record fine of $8.97bn and pleaded guilty to processing billions of dollars of transactions on behalf of Sudan, Iran and Cuba that were in violation of US sanctions on those countries.
Although other banks such as HSBC and Standard Chartered have been impacted by the long arm of US law, the size of the BNP Paribas fine sent a chill down the collective European banking spine. Many are also shivering over the latest salvo fired by Eric Schneiderman, the New York attorney-general, against Barclays, alleging its dark pool, Barclays LX, favoured high-speed traders while misleading institutional investors. Analysts believe that the probes could have repercussions for other European banks such as Credit Suisse, which operates the largest alternative trading system as measured by trading volume, according to the latest data from the US Financial Industry Regulatory Authority.
It is way too early to predict the fate of Barclays' dark pool, but US lawsuits tend to be long-drawn-out affairs. However, as the BNP Paribas experience highlights, and as every ardent football fan knows, it is the hope that kills, and expectations need to be managed.
Lynn Strongin Dodds
Shining A Light On Market Structure
With Anthony Godonis, Senior Equity Trader at Aberdeen Asset Management
I don’t think markets are broken, I just think they are very complicated. They’re complicated to the point where we really need to sit back and think about all of the intermediaries within the market, across the 13 exchanges, various pricing models and many dark pools.
There doesn’t appear to be much transparency in terms of understanding just what happens to your order. The recent conversation around IEX is encouraging as it has a lot of people asking the right questions. So, while I don’t think that markets are broken, I do think that they can become a lot more forthcoming in terms of how orders are routed, how algos choose certain venues, and why things are routed the way they are.
This is the first time that brokers find themselves just as frustrated by the lack of transparency as the buy-side in regard to accessing liquidity. Brokers offer less risk because there is simply less volume available to offer. I think the issue is that they aren't willing to take risk because they are experiencing the same situation as the buy-side. We would see a million shares offered in a given stock, but we'd realise that there's no way to actually get that whole million. So brokers can't make a market in that stock because they keep losing money due to the lack of actual available volume showing on the screen. It is the same situation even when we trade electronically: what we see available on the bid or offer is simply not there when we try to interact with it. It disappears.
Typically the buy-side is not very vocal in terms of what they want to see; the brokers are normally our eyes and ears to the exchanges. But a positive result from this conversation will be unified communication in terms of talking to regulators and exchanges.
In terms of specifics, we have to ask why some exchanges should have inverted pricing models or order types, but not others. Some orders via routers/algos may go ahead and execute on a particular exchange, not necessarily to jump the queue, but to get involved; as they start crossing the spread, signals are sent out to the market. Without uniform pricing and order types, there doesn't seem to be a level playing field.
A big part of this inconsistency is driven by Reg NMS. The question is should we rely on brokers to do our routing? I don’t know. I think changing this would alter the relationship between the buy- and sell-side. For instance, if the buy-side starts taking on all the routing on their own, what role would the broker play in terms of helping us execute trades? It becomes about us just using their pipes.
So what is the role of the broker?
There will always be a need for someone to take on risk on both the buy-side and sell-side. I think that if things change to the point where the sell-side knows that they can actually get the published volume that they’re looking at, you might not just see more block trades occurring, but potentially more volume will go back to the exchanges as well. And I don’t think trading on the exchanges is a bad thing; you have much better price discovery there than you do somewhere else.
For instance, on some of the dark pools where there is a lag in time, we have to ask what the best way to measure them is. The more volume that stays on the exchange, the better the opportunity for larger volume trades. I don't think that's a bad thing.
This wider conversation is very encouraging for investors because it has ramifications globally. The US is supposed to be the most efficient market in the world, but this whole high frequency conversation, having been exported globally, is affecting every asset class. It's encouraging to know that there is now a really big focus on it.
What would be on your shopping list to change market structure?
Well, I think a lot of internalisers should attempt to force liquidity back on to the exchanges. I think that would be positive. When you can bring buyers and sellers together in one place, it's better for price formation, and it's better for volume. Even if you look at auctions: I don't know many people who disagree with how the auction process works, whether in countries that have 20-minute VWAP auctions or those that just have Dutch auctions. I don't necessarily think that's a bad thing because everyone is on a level playing field. You put in your order, there's price formation, there's discussion and you're not alone. Everyone is included.
A similar situation happens in Brazil, where there are certain parameters that drive an auction. There can be intraday auctions, and when everyone knows what those parameters are, everyone feels like they're on a level playing field. Admittedly there is only one exchange, but if you trade over a certain amount of the outstanding shares, or the price moves more than a certain percent on a block trade, or more than a certain percentage of moving volume gets traded, it goes to auction. It could be one minute, two minutes, it could be one day, two days, but everyone knows what the rules are. And I think that we should be aiming to get to a point where everyone knows what the rules are explicitly. We have 13 different exchanges with different pricing and different order types. It's over-complicated. The market is simplifying some of this on its own initiative: there's been some consolidation amongst exchanges which I think will probably continue, especially if the rules change, which will probably result in more volume going back to the exchanges.
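The parameter-driven trigger described here is easy to picture in code. A hedged sketch follows; the thresholds are invented for illustration and are not the Brazilian exchange's actual published values.

```python
from dataclasses import dataclass

@dataclass
class AuctionRules:
    max_pct_outstanding: float = 0.01  # single trade > 1% of shares outstanding
    max_price_move: float = 0.03       # block prints > 3% away from last price
    max_pct_adv: float = 0.20          # single trade > 20% of average daily volume

def triggers_auction(trade_qty, trade_px, last_px, shares_out, adv,
                     rules=AuctionRules()):
    """True if any published parameter is breached, sending the stock to auction."""
    return (trade_qty > rules.max_pct_outstanding * shares_out
            or abs(trade_px / last_px - 1) > rules.max_price_move
            or trade_qty > rules.max_pct_adv * adv)

# A 2m-share block printed 5% through the last price on a 50m-share company:
print(triggers_auction(2_000_000, 10.50, 10.00, 50_000_000, 8_000_000))  # True
```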
And the buy-side is getting much sharper as to where their money is going?
I think that the most encouraging part of the current situation is that the buy- and sell-side are equally keen to find answers and reach a better solution. There are certain aspects of executing trades where there's a relationship involved and it definitely adds value. In this unique situation, the sell-side brokers are having as much difficulty executing as we are. We are all agreed: things need to be made more transparent and the rules need to be more explicit and less complicated.
The role of the buy-side trader is to execute trades with the least amount of impact possible. When you trade a block, it isn’t all about speed: you have a conversation, you negotiate and come to an understanding of what the price is going to be and why, what the volume is and you put it on the tape and move on.
It's difficult when there are no blocks to be had; what's the best answer other than trading the closing auction? You sit there and work something in the pipes, which leads to all the noise and leakage in the market.
I don’t know what the best outcome would be, but it’s encouraging to know that it’s not just one buy-side shop or one sell-side shop out there deciding everything.
Whatever the answer is going to be, it’s going to be positive because it’s going to bring a lot of things to light; information, transparency, clarity. I think it will be very good for market structure.
Trading in Singapore Report
In the second of a series of whitepapers from Equinix and Kapronasia, the latest report looks at trading in Singapore and its central role as a diverse and efficient trading hub in Asia, with an especially well-developed OTC derivatives market, ranked 8th globally.
Building on the G20 Pittsburgh Summit, the major OTC derivatives markets have been very busy implementing major reforms in centralised clearing. Singapore was relatively slow to react to the changes, but the rules are now clear and centralised clearing is in operation.
The other major story presented in the report is the increasing closeness between Singapore and China, with RQFII being rolled out to Singapore, allowing eligible institutional investors access to the quota system. This will only reinforce Singapore's role as a major FX hub in the region; it overtook Japan last year to become the third-largest FX trading market in the world.
The report ties together the broad regional trends, including the internationalisation of the RMB, changing trends in FX trading across the region and the demand for increasingly sophisticated derivatives products, to present the picture of an evolving and dynamic market that everyone needs to know more about.
Read the full version of the 'Trading in Singapore' report here.
Market infrastructure : Exchange technology : Neil Ainger
BRANCHING OUT.
Neil Ainger looks at the emerging frontier of technology services.
The trend towards third-party technology provision at emerging market exchanges is clear, with established bourses in developed markets offering 'exchange-in-a-box' or partnership deals. Frontier markets in Africa and elsewhere are also outsourcing their tech; the deals offer their Western counterparts much-needed revenue, and the emergent exchanges status, market access, compliance and experience.
"Why reinvent the wheel?" asks Mack Gill, CEO of the London Stock Exchange's MillenniumIT technology services unit, when discussing the rationale behind many emerging and frontier market exchanges' decision to source their technology from outside providers. "It's far more efficient to leverage partnerships to acquire 'best-in-class' technology that is already developed … using scalable architectures, shared standards and existing tech is a win-win."
It is a viewpoint that Lars Ottersgard, head of NASDAQ OMX Market Technology, agrees with. He notes that establishing new capital markets requires access to global best practice, technology and expertise around the legal, regulatory and trading framework, to serve local market participants as well as attract international investors. Emerging markets don't want to start from scratch. "We have partnerships with Borsa Istanbul in Turkey, the East Africa Exchange in Rwanda and the Kuwait Stock Exchange, among others, and have helped build these markets," he says. "In the latter case the objective was to support the Kuwait exchange's compliance with international standards and its contribution to economic development. NASDAQ OMX provided trading technology and strategic advisory services. This included the implementation of a new trading platform for equities, bonds and derivatives a few years ago. In addition, we oversaw the introduction of compliance and post-trade services."
Other recent examples include the LSE's MillenniumIT unit using its tech expertise in South Africa, installing post-trade technology at the Singapore Exchange and modernising the Mongolian exchange. It also introduced a smart order routing solution in Peru and delivered a central securities depository (CSD) project in Argentina. Third-party surveillance system installations are also common – sometimes in alignment with established exchanges, for example SMARTS, which is now owned by NASDAQ OMX – as are index calculators and market data distribution systems in end-to-end installations.
Deutsche Börse is also a major player in this space with its advanced Xetra platform. It has recently bought a stake in the GMEX Group which targets emerging market trading clients, and signed a Memorandum of Understanding with the Stock Exchange of Thailand (SET) to facilitate the development of the securities and derivatives markets between Thailand and Germany.
SET is no stranger to collaboration with European firms, having chosen Cinnober, an independent provider of mission-critical solutions and services to major trading and clearing venues, to partner with the Thai stock exchange and replace its old platforms with a new multi-asset system (see “SET boosts its firepower”).
Veronica Augustsson, CEO of Cinnober, believes this project demonstrates the importance for emerging market exchanges of staying flexible and not being tied to restrictive 'exclusive' partnerships that can create silos and have long-term negative impacts. Co-operation without lock-in is possible via the effective use of standards and service-orientated architecture (SOA)-like infrastructures that share common protocols, coding approaches and so forth. "I'm a believer in collaboration but it shouldn't be exclusive," she says, adding that while her firm sees established exchanges' tech services units as rivals, there is no reason why they cannot also work together to benefit end users.
Historically, dedicated technology companies have brought innovation and new tech to the marketplace that established exchanges wouldn't necessarily have delivered, says Augustsson, citing the raft of new multilateral trading facilities that resulted from the EU's first Markets in Financial Instruments Directive (MiFID) back in 2007 – and earlier from the Reg NMS changes in the US – and the fact that many of these new launches were subsequently acquired by traditional exchanges. "The LSE, for instance, acquired Turquoise and latterly Sri Lanka's MillenniumIT precisely in order to overhaul its old tech offering. This illustrates how standalone tech companies can refresh the market."
Off-the-shelf v partnerships
While retaining some flexibility in emerging or frontier market exchanges allows technology swap-outs and upgrades to happen more easily in later years, it is not the only approach. "In Brazil, for instance, BM&F Bovespa has partnered with CME Group and they don't sell an exchange-in-a-box type solution," says Aite Group analyst, Danielle Tierney.
CME favours a traditional comprehensive build-out methodology, which may require tweaks to make existing platforms better suited to local markets and often requires large internal IT staff numbers, but it does at least have the advantage of control, consistency and clear lines of responsibility. "CME has a strategic partnership with Mexico's exchange operator too, and they personify this partnership approach, versus the exchange-in-a-box type of project, which, say, the Nigerian exchange is pursuing with NASDAQ OMX," she adds. "Both approaches are valid and emergent exchanges must decide for themselves what is best."
Whatever approach is adopted, it is important for the end-user emergent exchange to install comprehensive end-to-end technology that can cover trading, reporting and surveillance operations, with compliance, clearing links and so forth also available. Only then can international markets be accessed, audit trails established and new financial hubs created.
Benefits and trend drivers
The main benefits for emerging or frontier market clients are access to high-tech financial hubs and liquid markets that can attract global investors quickly, drive economic growth and be created in a fraction of the time it would otherwise take. They also gain cachet, access to international markets, connectivity and tech know-how in an instant. Established exchanges, for their part, can tap new revenue streams at a time of declining equity volumes and profit margins due to regulation, dark trading and electronic broking.
Mark Hemsley, CEO at BATS Chi-X Europe, believes that competition fostered by newer venues such as his own, is only escalating the third-party tech provision trend. “The cost of trading and clearing in Europe has plunged since 2008: it’s now 10 times cheaper post-MiFID to trade and up to 150 times cheaper to clear. This has dented the revenues incumbent exchanges derive from trading,” he says. “Selling technology to emerging market players that may not enjoy comparable budgets or the same technical experience is a means to plug the shortfall. In some cases, it’s also a strategic alliance that allows the cross-propagation of business and clients between exchanges.”
Hemsley maintains that his own firm is not interested in joining the third-party exchange technology providers. “We’ve had multiple requests to start a technology business. Our conclusion so far has been the same: we’d rather focus resources on completing the integration of BATS Chi-X Europe, and are now doing the same in the US with Direct Edge. This is a mammoth task, from which we don’t want distractions.”
Conclusions
Jim Northey, co-founder of the LaSalle Tech Group and co-chair of FIX Trading Community’s Americas Regional Committee, is understandably keen to stress the need for standardisation in this space to ensure cross-border connectivity, via the likes of FIX messaging, and to aid technology integration and market efficiency.
“The use of FIX is essential to the growth of frontier and emerging markets,” he says, citing the recent positive impact of the introduction of FIX in Nigeria. “It also aids on-boarding, and certification provides certainty and resilience for market participants.”
© BestExecution 2014
Data management : Cloud solutions : Heather McKenzie
A MURKY PICTURE.
While no one refutes the benefits of the cloud, fears remain. Heather McKenzie reports.
Financial services firms may be increasingly adopting the cloud to manage their vast reams of data, but there are still underlying concerns. It is not only the much-talked-about security of the information, but also the stability of the cloud service provider.
In fact, in April, the US-based international affairs organisation the Atlantic Council and global insurer Zurich issued a report warning of the implications of such a breakdown. A 'Lehman moment', as the report characterises it, would result in data being "there on Friday, and gone on Monday". The report, Beyond data breaches: global interconnections of cyber risk, says such failures have the potential to ripple through the real economy, leading to a collapse similar to that seen during the financial crisis of 2008.
The global aggregations of cyber risk are analogous to those risks that were overlooked in the US sub-prime mortgage market, according to the report. The problems in that market segment eventually spread far beyond the institutions that took the original risks and reverberated throughout the global economy. For example, banks and regulators before 2008 tended to assess risk as though every organisation was self-contained.
“In hindsight, it was folly to examine risks one organisation at a time, while ignoring the interconnections,” says the report. “Yet this is just how cyber risks are looked at today. Obviously the internet has been incredibly resilient (and generally safe) for the past few decades, but as with securitisation, the added complexity that has made cyberspace relatively risk-free can – and likely will – backfire. Cyber-risk management needs to look beyond the internal IT enterprise to other aggregations of risk, such as outsourcing and contractual agreements, supply chain, upstream infrastructure, and external shocks.”
Among the report’s recommendations to address the potential of widespread collapse is to borrow ideas from the finance sector. For example, internet governance should be expanded and fortified by the creation of a G20+20 Cyber Stability Board (like the Financial Stability Board). Also, globally significant internet organisations should be recognised and the idea of cloud service providers being “too big to fail” should be addressed.
Raising the alarm
Other red flags have been waved regarding cloud computing. The revelations by former US National Security Agency (NSA) contractor Edward Snowden that the Agency operates extensive global surveillance programs, mining data and allegedly engaging in industrial espionage have alarmed some companies.
A report issued by US-based research and education think tank, the Information Technology and Innovation Foundation, said the allegations would “likely have an immediate and lasting impact on the competitiveness of the US cloud computing industry if foreign customers decide the risks of storing data with a US company outweigh the benefits”. The 2001 Patriot Act and the 2008 Foreign Intelligence Surveillance Act have given US intelligence agencies the power to carry out mass information gathering.
Others have also suggested the NSA spying is a problem: in July 2013 Neelie Kroes, the then European commissioner for digital matters, warned that European businesses were likely to abandon the services of US internet providers because of concerns about the security of their data. "If businesses or governments think they might be spied on, they will have less reason to trust cloud, and it will be cloud providers who ultimately miss out," she said. "Why would you pay someone else to hold your commercial or other secrets if you suspect or know they are being shared against your wishes?"
Switzerland is hoping to take advantage of the NSA scandal by providing highly secure and private cloud services, in much the same way that it provides private banking services. National telco Swisscom says its 'Swiss Cloud' can offer security from the prying eyes of other governments. This protection, however, possibly applies only to data kept within Switzerland; once data moves across borders it is open to access by other parties.
However, as financial institutions are required to collect, store, analyse and report on ever-increasing volumes of data, cloud computing and managed services are attractive options. Market data clouds, for example, enable firms to bypass the sizeable investments in infrastructure, hardware, software and maintenance that are typically associated with traditional data feeds. Utilising this new type of market data solution helps firms gain economies of scale, lower costs and increase agility. A cloud-based market data solution allows firms to instantly enrich their regulatory and investor reporting with comprehensive coverage of real-time, historical and reference data, without maintaining an infrastructure to do so.
Still investing
Technology consultancy Ovum released research in October 2013 that found both buyside and sellside firms were investing heavily in cloud services, and other financial market firms are also set to increase spending on IT infrastructure. Buyside firms were leading the way in cloud adoption, mainly because these firms are smaller and have limited budgets, says Ovum. Moreover, recent improvements to cloud security and a wider variety of applications are fuelling further growth.
Another consultancy, TABB Group, says as trading firms continue to offload commoditised activities, adoption of cloud offerings and other managed services will be among the leading manoeuvres to combat costs. In parallel with this adoption, “we see solid-state, multi-disk appliances and other high-performance computational solutions leading the production of analytics for better navigation of the market landscape,” says Paul Rowady, a principal at the firm.
Rowady says legal and security issues regarding cloud always will be top concerns for companies operating in the capital markets. “One of the issues that has come to our attention recently is that banking firms operating in particular countries may be required by law to keep the data within the country it pertains to. That would present significant impediments to cloud solutions that are multinational on a virtual basis,” he says. This is a challenge that would limit certain types of datasets being put into anything other than a private or hybrid (a private cloud operated by a third party) cloud offering.
Ralph Achkar, product director at market data company MarketPrizm, says financial firms are becoming more concerned about the jurisdiction in which their data is held. They are asking questions such as: how secure is my data? Who has access to it? To whom does it belong? He agrees with Rowady that national data protection laws – Switzerland, Singapore and Poland, for example, require customer data to reside within their borders – need to be considered when looking at cloud solutions for data management.
When it comes to security, Neil Smyth, marketing and technology director at portfolio analytics company StatPro, says a distinction should be made between "real" software-as-a-service (SaaS) systems and what he calls 'cloudwash' systems. The latter are traditional applications given a new, web-based interface. While the interface may be secure, he argues that the back-end database may not be. Secure cloud solutions should operate secure back-end databases and encrypt data. "Legacy applications that have been migrated to the cloud always add security as an afterthought," he says.
“It wasn’t baked in from the beginning. Security in pure SaaS applications is paramount because of the multi-tenant features and the fact that they are designed to be internet facing.”
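As a minimal illustration of security "baked in from the beginning", the sketch below encrypts records at the application layer so the cloud back end only ever holds ciphertext. It assumes the open-source Python cryptography package; in practice the key would come from a KMS or HSM rather than being generated in-process.

```python
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()   # in production: fetched from a KMS/HSM, never stored with the data
cipher = Fernet(key)

record = b'{"portfolio": "FUND-42", "nav": 1073741.82}'
stored_blob = cipher.encrypt(record)           # all the cloud database ever sees
assert cipher.decrypt(stored_blob) == record   # only the key holder can read it back
```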
Firms concerned about cloud services and the legal and security issues should implement good data governance regimes, says Adam Cottingham, vice-president, data management services at IT company SmartStream. “The key to cloud and managed services is a governance structure that recognises the data being processed and is reviewed regularly in order to ensure that it is keeping pace with both global and national data regulations,” he says.
© BestExecution 2014
Post-trade : Collateral management : Mary Bogan
THE COLLATERAL CONUNDRUM.
Mary Bogan looks at the uphill climb buyside firms are facing.
From the shaded anonymity of the back office, collateral management has emerged centre-stage to become one of the industry's most hotly debated issues, taxing the brain power, business models and processes not just of banks and clearing brokers but of custodians, clearing houses, central securities depositories (CSDs) and buyside institutions.
Propelling collateral management firmly into the spotlight has been a wide and complex raft of regulations, dominated largely by the European Market Infrastructure Regulation (EMIR) in Europe and Dodd-Frank in the US. Under EMIR, parties to derivatives contracts are required to report every transaction to a trade repository and implement new risk management standards for all bilateral over-the-counter (OTC) contracts, including operational processes and margining. Standardised OTC derivatives contracts will be subject to mandatory clearing via a central counterparty (CCP).
While a firm deadline for mandatory clearing has yet to be set, and the clearing requirements for participants such as pension funds will not hit with full force for several years, operations could begin as early as late 2014 now that both NASDAQ OMX and EuroCCP have been given the green light to act as clearers.
"There is still a little way to go but the clock is now officially ticking on European mandatory clearing", says David Field of consultancy Rule Financial.
What the regulations mean for buyside players has been dawning slowly as the growing interconnectedness between institutions becomes increasingly apparent and relationships become more tightly entwined. "If you lined up the myriad regulations impacting, say, asset managers over the next five years, it's a much more complex road map than it first appears", says Mark Higgins, European head of business development for BNY Mellon's global collateral services. "Asset managers are not insular beings. Regulations targeted at one part of the industry have knock-on consequences for other parts."
Higgins points to OTC swaps and pension funds as an example. Although pension funds were given a three year exemption from mandatory clearing to allow them to take a view, many decided to act sooner, says Higgins, for fear prices of non-cleared trades will shoot up. “Prices charged by brokers for non-cleared trades may have to reflect the extra charges this incurs, so there could be indirect consequences for funds not in the first wave, which nobody thought about three years ago.”
What is clearly apparent, though, is that the new regulations will undoubtedly increase the pressure to generate cash and the demand for eligible collateral. Far more transactions in more asset classes will have to be collateralised and margin calls will skyrocket. Market fragmentation will exacerbate the problem. "With the growth in the number of clearing houses, all requiring initial collateral and possibly dealing in multiple currencies, we estimate that margin calls are likely to increase by between five and ten times, depending on client product deal portfolios", says Mark Jennis, head of collateral initiatives at US clearing and settlement institution, Depository Trust & Clearing Corp (DTCC). Fragmentation will also pose bigger problems in Europe, where markets are less integrated, than in the US, says Jennis. Indeed, several fund managers have publicly stated they expect to post far more initial and variation margin in three or four years' time than they do today.
Warnings of a potential shortage of collateral in global markets, however, appear to have been overplayed. The European Central Bank and the Bank of England have estimated that the new collateral required for the new regulatory environment will total between US$ 3.6 trillion and US$ 6 trillion. Meanwhile, the International Monetary Fund calculates that the supply of OECD government bonds swilling around the system tops US$ 41 trillion.
"On top of that, you've got about US$ 2.5 trillion worth of collateral held in securities lending programmes and the cash available as collateral in the repo market, which is anything from US$ 950 billion to US$ 1.2 trillion", says Saheed Awan, global head of collateral services and securities financing products at Euroclear.
“The issue is not that there’s a collateral shortage. The issue is how do you mobilise all the collateral out there in a timely and efficient way”.
Field agrees. “Many clients we talk to recognise they have pools of collateral sitting in different geographies and business lines that are under-used. The problem is either they can’t see it or they don’t have the business processes or technology to enable its release.”
The ability to mobilise and optimise collateral is fast becoming the name of the game. And while the most agile banks and clearing brokers have begun to knock down the walls of business silos and invest in systems that automatically find and select their lowest cost collateral, the story on the buyside is generally one of all talk and no action.
“Not enough progress has been made on the buyside”, says Field. “Many are still attending conferences and figuring out what they need to do. Certainly talk of collateral optimisation for them is way too premature. Some have yet to get the basics in place.”
It's only when institutions are able to centralise margin processing, track collateral allocations, monitor eligibility and haircuts, and then substitute collateral to reduce the cost of funding, says Field, that they can start talking about full collateral optimisation. For the larger buyside institutions with complex portfolios, the answer to managing the non-core activity of collateral management will be outsourcing, although they will still need sophisticated systems to manage these relationships. The hope, though, is that the huge on-boarding crunch and accompanying mishaps that occurred in February, when the deadline for reporting to trade repositories expired, can be avoided.
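To make the optimisation step concrete, here is a simplified sketch of the cheapest-to-deliver selection at its core: meeting a margin call from eligible inventory at the lowest funding cost after haircuts. The greedy rule and all the figures are illustrative assumptions, not any provider's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    market_value: float
    haircut: float       # e.g. 0.02 => counted at 98% of market value
    funding_cost: float  # opportunity cost of pledging, in basis points

def allocate(margin_call, inventory, eligible):
    """Greedy allocation: pledge the cheapest eligible collateral first."""
    picks, covered = [], 0.0
    for asset in sorted(inventory, key=lambda a: a.funding_cost):
        if asset.name not in eligible or covered >= margin_call:
            continue
        picks.append(asset.name)
        covered += asset.market_value * (1 - asset.haircut)
    if covered < margin_call:
        raise RuntimeError("insufficient eligible collateral")
    return picks

inventory = [
    Asset("DE Bund 2024", 5_000_000, 0.01, 15),
    Asset("Corp bond AA", 3_000_000, 0.08, 40),
    Asset("Cash EUR",     2_000_000, 0.00, 60),
]
print(allocate(4_000_000, inventory, eligible={"DE Bund 2024", "Cash EUR"}))
```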
“When it comes to mandatory clearing, I hope they’ve all kept the lights on and monitored developments”, says Higgins. “You don’t want to run for the door at the last minute because this change involves third parties. Clients need to pick their clearing brokers and build an understanding of clearing venues now.”
That’s easier said than done though. While the banks, custodians, CCPs and CSDs have found workable solutions to some of the big challenges posed by collateral management, hoary problems remain.
"Reporting is the big elephant in the room that no one wants to talk about", says Philip Popple, a derivatives specialist at BNY Mellon. "BNY has solutions for pretty much every aspect of EMIR but this is one area we've yet to close out, and our competitors face similar issues. None of us has a solution we are comfortable with".
To date, reporting has proved a fraught affair. Many derivatives users have still to apply for their preliminary legal entity identifier (LEI), which would allow them to report their over-the-counter and listed trades, and tracking is complex. "Because data is held in different places, tying it together has been a struggle for the industry", says Popple.
The hidden costs
However, the main buyside concern is cost, hidden and explicit. "What pension funds and insurance companies tell me keeps them awake at night is the cost of transforming their portfolios into eligible collateral. Swapping corporate bonds and equities for low-yielding government bonds clearly has implications for performance", says Euroclear's Awan.
In addition, as financial institutions try to recoup the huge costs of establishing collateral management systems, incurred partly through over-estimating buyside enthusiasm for full-lifecycle collateral services, the temptation is to pass on their pain through the chain.
“The fundamental issue facing the industry in servicing the buyside is that costs have gone up enormously,” says Awan.
Clearing and settlement costs, he says, are not the issue. The services Euroclear offers in its new joint venture with DTCC, which gives the industry access to a huge pool of collateral and delivers it efficiently to where it needs to be, are charged at utility price levels, thanks to extensive automation. “But it’s pointless for us to be the cheapest point in the chain if, on either side, others are imposing substantial charges.”
If the industry tries to exploit mandatory clearing as a revenue opportunity, warns Awan, the buyside will walk. "If all parties in the chain are not cognisant of buyside costs over the whole lifecycle of trades, we will all be losers. The buyside will simply walk away or find alternative ways to hedge. And that doesn't just hurt the financial services industry. It hurts you and me and every citizen with insurance and a pension."
© BestExecution 2014
Technology : Outsourcing : Louise Rowland
THIRD-PARTY TIME.
Louise Rowland looks at the return of outsourcing.
We've been here before: growing numbers of financial firms calling on outsource providers to help them meet the challenges of the marketplace. However, today's picture is different from five or six years ago, when cost reduction was a matter of life and death for many firms, who lifted out their entire back office to third-party administrators offshore just to stay afloat.
Offshoring is still a significant part of the picture but, with the market now healthy again, outsourcing is no longer just about saving money. Moving to a variable rather than fixed cost structure is, of course, a powerful incentive for companies seeking cost efficiencies in their operating model. Yet many firms are also now relying on the expertise of third parties to help them negotiate the growing thicket of regulation facing the financial services industry. In many cases, a large well-established outsourcer will have better security and compliance measures in place than the client itself.
Then there’s the role outsource providers can play in helping firms grow their business. It’s all about being leaner, meaner, smarter and more intelligent. By off-loading non-core activities, clients are able to focus on innovation and exploiting new markets, whether that’s developing an existing product, moving into a new asset class or entering a different geographical area. Outsourcing companies possess the technological knowledge and skills to help drive this process. Taking advantage of a scalable service allows firms to get to market much more quickly.
Changes for the better
"The COO of a bank is facing increasing calls from the front office to lower the costs per unit, as well as the demand to innovate and deal with regulatory pressure, all against a background of reduced margins and trading costs," says Tom Carey, President, Global Technology and Outsourcing, EMEA and Asia at Broadridge. "Banks can't make all this work by themselves and are coming to the realisation they need to adapt. This is an opportunity to respond and transform themselves."
Underpinning this transformation are rapid developments in technology, particularly the arrival of the cloud, which allows vast amounts of data to be stored securely off-site. Nothing could be further from the image of large teams of back office staff number-crunching in a dimly-lit basement.
The great hand-over
The UK is ahead of the game in outsourcing and, perhaps not surprisingly, buyside firms tend to be further along the adoption curve. Most smaller and mid-sized asset managers don't have the budget or the appetite to maintain a large IT infrastructure in-house and have been outsourcing process-intensive back office activities for years, including custody and fund accounting, settlement, record maintenance, confirmation and clearance, and regulatory compliance.
There's also been a steady move up the value chain, with many middle office functions, such as trade matching and books of record, now sitting with third parties. Dave Reynolds, principal at Investit, says, "People are moving their middle office functions less on a cost basis now, but because they're after best practice, flexibility, knowledge and skills. Providers are bolting on additional services such as client reporting, performance measurement and collateral management, so we're seeing an expansion of the outsourcing landscape in terms of depth."
Different firms, inevitably, have different comfort levels. For some, handing over a single component of their business and retaining the rest of the workflow is change enough. Others, particularly hedge funds or boutique firms, are happy with a full outsourcing package as it allows them to focus entirely on their core activities.
Sharing the load
It's all about ceding a degree of control, something that traditionally doesn't come as easily to the sellside. Yet that's also changing. There are no more sacred cows in a climate where it's all about costs and balance sheets, tough regulatory pressures and a hunger for market differentiation. So while some sellside firms might want to keep, say, data aggregation in-house, they are increasingly relaxed about outsourcing other pockets of their operations, such as EMIR compliance.
One striking innovation is the emergence of the utility model for internal data management practice. Banks have always operated mini silos of data for numerous clients – a cumbersome, duplicative and expensive way of working, given these are mostly commoditised activities. Many are now exploring mutualised solutions in the form of industry standard reference data utility models, set to improve data quality, mitigate risk and reduce costs, in some cases by hundreds of millions of dollars per bank.
“It’s a dramatic shift in mindset,” says Steve Grob, director of group strategy, Fidessa. “We’re starting to see an increasing number of banks and broker dealers leave their guns outside the door and collaborate on infrastructure where it is purely duplicative and adds no competitive advantage.”
Daniel Simpson, managing director and head of enterprise software at Markit EDM, prefers the term 'managed service model' to 'utility model'. "It's not a one-size-fits-all solution, but bespoke according to individual firms' needs. There's lots of commonality in terms of scrubbing and integrating data but clients also want bespoke output. That's where the value lies. Many people have tried to build utilities in the past and failed. The time is now."
Partners to rely on
So what makes a trusted outsourcing partner? It’s all about providing the right comfort level, says John Lehner, head of BNY Mellon’s outsourcing business and president and CEO of Eagle Investment Systems. “When we engage with clients, they want to validate how we conform to their requirements in terms of operational readiness, stability and data security. In the past, you might have been providing a single piece of the front, middle or back office picture. Now the client’s decision to partner up with you rests on what you are doing to deliver a fully integrated service ecosystem or platform. Our offering has to be preconfigured to integrate with other people’s solutions.”
Large custodians have long been offering peace of mind, adds Toby Glaysher, head of global fund services, EMEA, Northern Trust: “Overall, outsourcing is a good thing from a risk perspective. It’s better for asset managers, say, to outsource work to specialists with the technology, scale, global participation and business continuity that we can offer. With trillions of dollars of assets under administration, we bring lots of capability and ‘oomph’ so that de-risks the asset management industry.”
The question of risk feeds into the whole area of regulation. The spotlight has never been more intense on areas such as data retention and record keeping, with regulators requesting large amounts of granular information. Once again, this is where outsourcing comes into its own. Many firms don't have the internal infrastructure to meet this level of compliance – it is far more cost-effective to use a third party than to make a major investment in new IT systems. Outsourcing also gives banks the opportunity to measure unit costs far more easily.
Compliance remains a two-way process, however. All client firms retain a fiduciary responsibility and the FCA is calling for stringent measures on oversight and resilience, including a robust exit plan and contingency arrangements should things go wrong. For some asset managers, this level of monitoring even calls into question the decision to outsource activities in the first place.
Something for everyone
‘Running your mess for less’ has been replaced by a realisation that outsourcing can deliver sophisticated business benefits. This is a permanent change, say most participants. Undertaking it on a large scale is not for everyone, particularly on the sellside, where many firms prefer to work with their own platform rather than embarking on the complex, lengthy process of migrating to a provider’s platform.
Nevertheless, outsourcing is here to stay. The City has always been adept at adapting; this is a way of working that offers numerous advantages, however participants pick and mix from the banquet of solutions on offer.
© BestExecution 2014
Buyside challenges : Block trading : Mark Goodman
BLOCK AND TACKLE.
Mark Goodman, head of European quantitative electronic services at Société Générale discusses solutions to navigate a crowded marketplace.
According to recent research from TABB Group, buyside firms continue to value block trading in equities. However, there has been a significant gap between what could trade as a block, the amount the buyside wants to trade in blocks and the actual block numbers – a gap that has persisted despite a number of efforts to "crack the code", each with inherent limits. Do you agree with this, and what other problems do you think there are?
Whilst trading blocks is not the only challenge facing the buyside, it is certainly one of the topics we find raised most frequently. This is partly a function of the changing market structure but it also seems to be self-perpetuating: if everyone else is splitting their trades into smaller and smaller sizes then you are going to follow the crowd rather than post a block.
So whilst the market structure is important, there is a behavioural issue here. If what you want is blocks, why do you divide your orders into small sizes, whether explicitly when submitting to dark pools through an 'aggregator-type' solution, or implicitly when sending them to a broker's algorithm?
The overwhelming response was that, as the buyside trader does not know who might be trading against them in a particular venue, the right thing to do is to be cautious. In short, the lack of certainty in terms of outcome means people shy away from committing themselves in block size.
The TABB report noted that the industry needs to resolve multiple requirements often at odds with one another, including: enabling trusted counterparties and allowing for anonymity; sourcing liquidity and paying for services; quick-matching engines and seamless workflow integration; and marrying electronic messaging with traditional sales trading. Does PartnerY provide a solution, and how does it address other issues?
The key area we have tried to address in terms of the above is how to enable trusted counterparties yet still allow for anonymity. The issue around an uncertain outcome stems very much from not knowing who is there. PartnerY – by allowing clients to construct a group of counterparties in a pool with whom they wish to interact – ensures that the potential outcome of a cross is much more predictable. However, we retain the notion that at the point of execution the counterparty remains anonymous. Whilst you know who you exposed the order to out of a universe of selected counterparties, you do not know specifically who you crossed your order with at the time of execution. So you retain anonymity at the point of trade.
To some extent this becomes similar to working with a sales trader to source a block for you. The sales trader, using their knowledge of your objectives and of the holdings and objectives of potential counterparts, will selectively try to match buyers with sellers whilst ensuring that anonymity is maintained. PartnerY offers a complementary electronic solution to this more traditional service.
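For readers who want a concrete picture of the mechanism Goodman describes, the sketch below shows, in simplified Python, how a crossing engine might enforce mutual counterparty whitelists while keeping the counterparty hidden at the point of execution. The class, function and participant names are hypothetical illustrations, not PartnerY’s actual implementation.

# Minimal sketch of permissioned, anonymous crossing (illustrative only).
from dataclasses import dataclass

@dataclass
class Order:
    owner: str       # participant identity, known only to the venue
    symbol: str
    side: str        # "buy" or "sell"
    quantity: int
    trusted: set     # counterparties this participant has agreed to face

def cross(orders):
    """Match buys against sells, but only where each side has whitelisted
    the other. Fill reports omit the counterparty, so anonymity is
    preserved at the point of trade."""
    buys = [o for o in orders if o.side == "buy"]
    sells = [o for o in orders if o.side == "sell"]
    fills = []
    for b in buys:
        for s in sells:
            if (s.symbol == b.symbol
                    and s.owner in b.trusted
                    and b.owner in s.trusted):
                qty = min(b.quantity, s.quantity)
                if qty > 0:
                    b.quantity -= qty
                    s.quantity -= qty
                    # each party learns only symbol and size, not who they faced
                    fills.append((b.symbol, qty))
    return fills

orders = [
    Order("fund_a", "VOD.L", "buy", 100_000, trusted={"fund_b"}),
    Order("fund_b", "VOD.L", "sell", 60_000, trusted={"fund_a"}),
]
print(cross(orders))   # -> [('VOD.L', 60000)]

The key design point the sketch tries to capture is that the whitelist check is mutual: a cross only occurs when both sides have selected each other, yet neither fill report reveals the counterparty.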
In terms of sourcing liquidity and paying for services we see a very clear approach from the buyside. Even bundled clients will step outside their broker list where a service or provider can really add value in terms of differentiated liquidity.
Some buyside firms have become disillusioned with information leakage and falling order fill sizes in (some) dark pools, although they like the anonymity. Do you see PartnerY as a substitute or an add-on?
We see PartnerY as an addition to the choices already available to the buyside when sourcing liquidity from dark pools. Whilst we do not believe that all liquidity is good all of the time, all liquidity is good some of the time.
If we look at what is already available in the market, we would agree that the majority offer trade sizes equivalent to those in ‘lit’ venues, and our own analysis shows clear evidence of information leakage. However, if your order contains significant short-term alpha, and immediacy of execution overrides concerns around signalling risk, then you would be happy to interact with these venues.
PartnerY, by nature of the fact that users limit the number of counterparties they expose their orders to, offers less immediacy. But equally it can offer an environment in which clients feel comfortable committing large size. That means it occupies a different end of the immediacy-versus-quality spectrum from a traditional dark pool.
In Europe, market participants generally trade in small- and mid-cap stocks with an average order size of around 55% of average daily volume (ADV) in a given stock. How does that compare with the wider market?
This has very much been the profile of the liquidity we have seen in PartnerY. It is not something we dictated; it is a result of where users have positioned PartnerY, indicating that a venue which can handle this type of stock and size is what they were missing. Typically we have found that the platform is used much more for small- and mid-cap stocks than for large caps.
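As a rough illustration of the measure cited in the question, the short Python snippet below expresses an order’s size as a percentage of ADV, using assumed figures rather than any real PartnerY data.

# Hypothetical figures: an order's size as a percentage of
# average daily volume (ADV).
adv = 200_000           # assumed average daily volume in shares
order_size = 110_000    # assumed order size in shares

pct_adv = order_size / adv * 100
print(f"Order is {pct_adv:.0f}% of ADV")   # -> Order is 55% of ADV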
What has been the reaction since the November launch and what are your plans for future development?
We feel we have really captured the attention of clients with the concept of passing control from the venue operator to the end user, so that they can construct the venue(s) they want. We have a strong pipeline of mainly large long-only asset managers and currently have just north of thirty clients, adding around three to four new parties a month.
Clearly we need to turn this enthusiasm into material cross rates. This is challenging when you are focused on organic growth of participants and natural liquidity. However, we fundamentally do not want to try to accelerate this by adding market-making activity, as this would effectively drive PartnerY towards the same scenario as existing offerings in the market, which would leave it undifferentiated.
This is a more challenging approach and requires patience, yet we believe the end result is worth it, and our existing partners have been very vocal in encouraging us to see this through. So our current focus is on helping our clients concentrate liquidity to capture crossing opportunities, and we will continue to experiment with the offering in order to achieve that.
You say that a unique element of this service is that you are transferring control from the venue operator to the end user. How so?
We allow each participant to select the partners they want to interact with and to police their list – including or excluding counterparties – on demand. Our role is to operate the service and provide data that allows participants to make informed decisions about managing their group. Data reports allow clients to identify partners whose behaviour does not match their desired objectives, including quality indicators around reversion performance as well as frequency and size of trades and average resting time.
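The snippet below sketches, with assumed field names and figures, the kind of per-counterparty statistics such reports might contain. It illustrates the metrics Goodman lists – trade frequency and size, average resting time, and post-trade reversion – and is not the actual report format.

# Illustrative per-counterparty quality summary; all names and numbers
# are assumptions, not PartnerY's real data.
from statistics import mean

def counterparty_report(trades):
    """Summarise fills against one counterparty: frequency and size of
    trades, average resting time, and post-trade reversion."""
    return {
        "trade_count": len(trades),
        "avg_size": mean(t["size"] for t in trades),
        "avg_resting_secs": mean(t["resting_secs"] for t in trades),
        # reversion: how far the price moved against you after the fill
        # (in basis points); persistently positive values would suggest
        # adverse selection by this counterparty
        "avg_reversion_bps": mean(t["reversion_bps"] for t in trades),
    }

trades = [
    {"size": 50_000, "resting_secs": 420, "reversion_bps": 1.2},
    {"size": 80_000, "resting_secs": 900, "reversion_bps": -0.4},
]
print(counterparty_report(trades))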
Future development is also managed in this way. Whilst we have ideas of our own, we are very active in consulting existing participants and only make changes when we find consensus. Even more encouraging, we have seen examples of participants discussing issues directly with each other and then coming to us with joint proposals. This is very much in the spirit of the approach, and we believe it differs from the more traditional model of designing finished products and then pushing them out to potential users.
That being the case, what control do you exercise and where is this likely to lead?
Outside of our obligations to manage the platform according to regulatory requirements we do not exercise any control over the type of participants or who should interact with whom. We offer the service to all clients of the firm without prejudice and equally offer all participants the option to interact, or not, at their discretion. We believe this is the best approach to ensuring that the evolution of PartnerY remains in line with the evolution of user needs – rather than short-term market share or revenue wins.
[Biography]
Mark Goodman joined SG CIB in 2010 as head of European Quantitative Electronic Services. He started his career as part of the cash equity team at SBC Warburg in 1997 (later UBS), and began specialising in electronic trading in 2001. Following several years in the London team with particular responsibility for growing the hedge fund business, he moved to the US in 2006 to establish the European service with domestic clients. Goodman returned to London in 2008 as head of European electronic sales before leaving UBS in 2009.
©BestExecution 2014