What a lot can change in a year! Since the last FPL Canada conference, held in May 2008, Canada has been drawn into the liquidity crunch along with the rest of the world. Yet Canada has a risk and regulatory model that is different from many of its established trading partners, most notably the US and the UK. Can the world learn lessons from the Canadian experience?
With only weeks to go until Canada’s leading electronic trading event, it is still hard to predict what the credit crisis and regulatory environment will look like on June 1st, the first day of the conference. What we do know is that there will be increased regulatory involvement, particularly in areas that were previously not subject to scrutiny. In the run-up to the event, we asked a range of experts to comment on what they feel will be the hot topics at this year’s event.
Conference Hot Topics
(A) Market Volatility
Market volatility has certainly changed trading patterns. The increasing reliance on electronic trading, leveraged through Direct Market Access/Algorithmic Trading or a portfolio trading desk, is directly connected to the growing need to manage risk, volatility and capital availability. We have seen a dramatic uptake in these services and tools, as the focus has been not only on how electronic trading is conducted, but also on trading costs measured against benchmarks. The greatest impact has been a higher use of electronic trading strategies relative to more traditional trading and a shift in the types of electronic trading strategies employed.
From a single stock perspective, many traders were loath to execute at single, specific price points due to the potential for adverse percentage swings in the high single to double digits. Algorithms have been employed on a more frequent basis, to help traders participate throughout intervals on an intraday basis, managing risk around the volatility. The violent intraday swings also create significantly more opportunities in the long-short space. Quantitative execution tools have become more of a focus, to take advantage of these opportunities on an automated basis.
What the industry is saying:
“In terms of disruptions relating to market volatility, the Canadian trading infrastructure generally held up admirably. Most dealer, vendor, and marketplace systems handled the massive increases in message traffic and activity with little noticeable impact on performance. This is proof positive that the investment in capacity and competition was well worth it and is now paying dividends.” Matt Trudeau, Chi-X Canada
“Electronic platforms using FIX and algorithmic routers handled significant market fluctuations with no impact to performance. Speed to market for orders made it possible for traders to minimize exposure to huge swings in pricing and to capitalize on opportunity.” Tom Brown, RBC Asset Management
“Many traditional desks were shell-shocked and did not know how to respond to the volatility combined with the lack of capital. Electronic trading tools like algos enabled people to manage extremely volatile situations with great responsiveness. They were able to set the parameters for their trades and let the algo respond as market conditions warranted. Trading of baskets/lists made having electronic execution tools critical. You couldn’t possibly manage complex lists in real-time without very sophisticated electronic front-end trading tools.” Anne-Marie Ryan, AMR Associates
“The recent volatility spike means that risk will likely be scrutinized more in the future than in the past. Post-trade transaction cost measurement systems generally do not consider risk but instead focus on cost. To properly align the interests of the firm and the trader, performance measurement systems will need to reflect both cost and risk considerations.” Chris Sparrow, Liquidnet Canada
Jenny Tsouvalis of the Ontario Municipal Employees Retirement System (OMERS) sees a need for effective integration of investment management and trading processes. “On-line, real-time electronic trading systems provide quick access to liquidity and, when coupled with real-time pricing embedded into blotters, identify the effect of market changes on the portfolios and the effect of trading decisions.”
“Electronic trading has been successful because of its ability to be adaptive, so it is likely to change in reaction to current issues.” Randee Pavalow, Alpha Trading Systems.
“We’ve seen an increase in the use of algorithms and over the day orders as volatility has increased. The ability to smooth orders over a longer period limits the exposure to price swings during the day. VWAP, TWAP and percentage of volume, seem to be the algos of choice for many these days.” John Christofilos, Canaccord Capital
Masters of Risk – Can regulators and technology be the solution?
You can’t change the past, but to protect the future, NASDAQ OMX’s Brian O’Malley argues, the financial services industry needs to partner with the regulators to evaluate what went wrong, which market mechanisms worked and which did not. Best-practice risk management controls, O’Malley argues, must be put in place, and followed.
The current financial crisis could be described as a disaster waiting to happen. At too many institutions the basic tenets of good long term risk management were being ignored in favour of short term profits. A lack of transparency meant many investment bank boards and executives did not fully understand the risk their firms were assuming with complex new instruments. As a result, they were not able to properly assess the return they were getting for them.
At the same time, trillions of dollars worth of risky OTC contracts were being traded and cleared bilaterally around the globe without proper risk management controls, and the regulators have been accused of being asleep at the wheel.
Opinions vary, but most observers agree on one thing. If regulated exchanges and clearing houses, with proven risk management practices, had a role in the OTC derivatives markets, the crisis would not have been as severe. Therefore, these organizations must be part of the solution.
Exchanges and clearing houses assume and manage counterparty and clearing risk. The members put up capital, and collateral is collected from the counterparties in the form of initial margin. The exchange measures and manages intra-day risk. When market volatility increases, a variation margin call is made. The positions of firms that cannot meet the margin call are liquidated. When the circumstances require it, the exchange can tap into the shared capital and the clearing corporation’s capital.
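To make these mechanics concrete, here is a minimal sketch of an intra-day variation margin check in Python. The names and figures are hypothetical, and the maths is deliberately simplified relative to any real clearing house model:

```python
from dataclasses import dataclass

@dataclass
class ClearingMember:
    name: str
    collateral: float    # initial margin already posted
    position: float      # signed quantity: + long, - short
    entry_price: float

def variation_margin_call(member: ClearingMember, mark_price: float) -> float:
    """Return the additional collateral owed after marking to market.

    A positive result is a margin call, collected intra-day; if the
    member cannot meet it, the position is liquidated.
    """
    unrealised_loss = max(0.0, (member.entry_price - mark_price) * member.position)
    return max(0.0, unrealised_loss - member.collateral)

member = ClearingMember("Firm A", collateral=50_000, position=10_000, entry_price=25.0)
call = variation_margin_call(member, mark_price=18.0)  # a volatile session
if call > 0:
    print(f"Margin call of {call:,.0f}; liquidate the position if unmet")
```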
This system worked extremely well, even during the worst days of 2008. So the question is: should OTC contracts be migrated to exchange trading and central counterparty clearing?
There are precedents for central counterparty clearing in the OTC markets. CLS Bank operates the largest multi-currency cash settlement system, eliminating settlement risk for over half the world’s foreign exchange payment instructions. In the US fixed-income marketplace, the Fixed Income Clearing Corporation processes more than US$3.7 trillion each day in US Government and mortgage-backed securities transactions.
For the last few years, the London Clearing House has operated a clearing facility for interest rate swaps. The International Derivatives Clearing Group (IDCG) recently started clearing and settling US dollar interest rate swap futures, and Liffe’s Bclear, which already clears OTC equity derivatives, started clearing credit default swaps (CDSs).
Role of risk-mitigating technologies
Creative use of technology can also play a big role in mitigating risk. For example, a large NASDAQ OMX technology client is leveraging technology to minimize pre-trade risk. Its customers wanted a system that could automatically validate a counterparty’s collateral and filter out bids and offers from unacceptable trading firms. In practice, this meant traders could only see bids and offers from firms with whom they had open credit lines. Considering the number of versions of the order book that need to be sent out, the challenge was to create a system that was robust but not overly expensive or complicated.
The NASDAQ OMX solution was to allow the client to send an order book to all its customers in a single encrypted message. Users have a decryption key that determines which view of the order book they are permitted to see. The client has been using this technology in Europe to help handle pre-trade risk in securities lending.
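The following sketch illustrates the general idea of a single encrypted broadcast with per-user views. It is not NASDAQ OMX’s actual implementation; the firms, quotes and toy XOR cipher are all stand-ins, chosen only to keep the example self-contained where a real system would use proper cryptography:

```python
# Toy XOR cipher purely to keep the sketch self-contained.
def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

MAGIC = b"OK|"  # marker so a subscriber can tell which entries it decrypted

def encrypt_entry(entry: str, firm_key: bytes) -> bytes:
    return xor_bytes(MAGIC + entry.encode(), firm_key)

def visible_book(broadcast: list[bytes], keyring: list[bytes]) -> list[str]:
    """Return only the entries this subscriber's credit lines entitle it to see."""
    view = []
    for blob in broadcast:
        for key in keyring:
            plain = xor_bytes(blob, key)
            if plain.startswith(MAGIC):
                view.append(plain[len(MAGIC):].decode(errors="replace"))
                break
    return view

# One broadcast message; each entry encrypted with the originating firm's key.
keys = {"FirmA": b"alpha-secret", "FirmB": b"beta-secret!"}
broadcast = [
    encrypt_entry("BID 100 @ 25.10 (FirmA)", keys["FirmA"]),
    encrypt_entry("ASK 200 @ 25.20 (FirmB)", keys["FirmB"]),
]
# This subscriber has an open credit line with FirmA only:
print(visible_book(broadcast, keyring=[keys["FirmA"]]))
```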
Despite these examples, some Wall Street firms are resistant to the idea of trading OTC instruments on exchanges and clearing them centrally. They argue that OTC contracts are far more flexible than standardized, exchange-traded contracts. Bilateral trading allows them to retain anonymity. OTCs can be traded electronically or by voice, and typically the larger the deal, the more likely it is to be executed over the phone. Moreover, many OTC instruments are off-balance-sheet products. If they are centrally cleared, firms would have to put up collateral, such as treasury bills, to meet margin calls, and these would be an on-balance-sheet item.
An on-balance-sheet item is an asset or liability that a firm formally owns or is legally responsible for. These items also are recorded as a profit or loss on the firm’s income statement. Futures, forwards and derivatives typically are not included on the balance sheet, and therefore they do not affect the firm’s liquidity and capital resources. The issue of off vs. on balance sheet items needs to be included in future risk management discussions.
The OTC market makers also have a vested interest in maintaining the status quo because the lack of transparency and price discovery works in their favor. Migrating these products to exchange trading and central counterparty clearing could reduce their revenue.
Introduced in November 2008, the Derivatives Trading Integrity Act amends the Commodity Exchange Act to eliminate the distinction between “excluded” and “exempt” commodities and regulated, exchange-traded commodities, so futures contracts for all commodities would be treated the same. It also eliminates the statutory exclusion of swap transactions, and ends the U.S. Commodity Futures Trading Commission’s authority to exempt these transactions from being traded on a regulated board of trade. In effect, this means that all futures contracts must be traded on a designated contract market or a derivatives transaction execution facility. In addition, all swaps contracts fall under the definition of futures contracts and function basically in the same manner as futures contracts. While it is far from certain that this or a similar bill will pass, clearly the topic will be hotly debated.
At this stage, it is unclear how the OTC market will look in the future, but change is inevitable. The financial crisis highlighted the need for the type of real-time risk management that is an integral part of regulated exchanges and clearinghouses. These organizations can provide transparency where it does not exist today, mitigate counterparty and operational risk and reduce trading costs through the use of efficient technology.
At the same time, the basics of risk management haven’t changed. It is simply the application that has become sloppy. The financial institutions that remained in good shape through the financial crisis were the ones that maintained their risk management discipline and spread their risk around. A back to basics approach, when it comes to risk management, can be just the right medicine for an ailing financial services industry.
Top 10 best practices in risk management
The financial crisis would not have been as severe – or possibly could have been averted – if the best practices in risk management had been followed. Many key tenets were ignored in the interest of short-term gains. Going forward, here are the top 10 practices that, if followed, can help prevent future meltdowns.
- Recognize and manage the natural tension between P&L and risk mitigation; ensure there is a vibrant dialogue and balance.
- Be prepared to walk away from a deal or a product.
- Start at the top; ensure oversight and engagement of independent directors.
- Make individuals accountable, not committees; every risk has an owner.
- Look around the corner and anticipate risk even if the light is bad.
- If it is too good to be true from a margin perspective, it is, and doesn’t include the risk cost.
- Empower the internal audit department; ensure that the staff is capable and can think like business people.
- Engineer smart controls that are automated and woven into the systems and operations; challenge the control status quo as often as you do the business delivery.
- Leverage the experts in assessing risk and controls; know what you don’t know.
- Maintain transparency for the risk inventory, risk appetite and monitoring of the risk appetite.
Going Off-Market: Can Alternative Venues Give Solace in a Skittish Market?
By Yoichi Ishikawa
National exchanges worldwide have been pummeled by the recent economic turmoil and Japan’s Tokyo Stock Exchange (TSE) has been among the hardest hit, both in terms of value and volume. The trend towards alternative trading venues, seen in the US and Europe over recent years, has spread to Asia, with Japan among the first adopters. kabu.com’s Yoichi Ishikawa asks whether these new venues may prove a popular option in tumultuous times.
In the first two months of 2009, the stock market in Japan saw continuously low levels of trading and ongoing fears that liquidity would dry up permanently. Daily trading volumes on the TSE dipped below Y2 trillion. Adding to trading fears, the volume share traded on the exchange exceeded 90 percent of Japan’s entire domestic market. This concentration of trading on one exchange has, in fact, been the norm for many years in Japan, but concerns are growing that a single-exchange environment is curbing market efficiency, with transactions often not achieving best execution. Under these circumstances, several new and sophisticated proprietary trading systems (PTSs) emerged for off-market transactions in 2008. The goal for these PTSs has been to create an electronic trading method that allows the selection of the best price among multiple prices for a single issue.
Japan’s financial system has allowed for such off-market online transactions since 1998, and in 2001 two companies set up operations. However, investor interest was slow to materialise, as a lack of sophistication in early versions meant that prices could not be formed in the same manner as on the exchange. These system weaknesses kept the share of PTS transactions in off-market trading below two percent between 2001 and 2006. A revision of the law in 2005 prompted a major shift, allowing for price formation in the same manner as on the exchange. In response to this change and a perceived demand for extended-hours trading, kabu.com launched its PTS in 2006, offering night-time trading.
By 2007, the share of PTS trading had reached about 4 percent. While growth was still slow, the stage had been set for the increasing popularity of PTSs. The following year, market share expanded to 5.5 percent, with a single-month record high of over 9 percent in November 2008.
Driving factors for growth
Soon after, two companies (including kabu.com) began daytime PTS trading using the same auction format as the exchange, enabling investors to do full-scale ‘off-market’ trading in the same way as ‘on-exchange’ trading. These companies also began transmitting market data to global information providers (such as Thomson Reuters), which allowed for real-time PTS market price information, including multiple quotations (depth).
The new PTSs also allowed transactions at finer prices, in smaller units than the exchange. Adding to the sophistication, transactions using advanced algorithms – mainly by foreign securities firms – opened up a new trading method that brought new liquidity to the PTS market. New entrants to the PTS market (bringing the total number of operators to six, with the two newly licensed companies focusing on individual investors) helped raise awareness of PTSs among institutional and individual investors alike. Together the companies offered off-market trading to individual and/or institutional investors. (See diagram A.)
Trade by comparing multiple prices
Diagram B shows a typical kabu.com interface, tailored for individual investors. It allows users to see information on both kabu.com and the TSE. Market information tools allow for share price comparison across multiple PTSs and exchanges. With a number of traders offering identical prices on the Exchange, long queues to execute trades can occur. PTSs, on the other hand, have fewer traders, meaning shorter queues. They also allow for arbitrage trading at prices different from the Exchange, even during the same hours as the Exchange.
Equally, PTSs allow for smaller price units – one-tenth of those on the Exchange – which makes algorithmic trading and breaking orders up into smaller trades easier. Together these advantages allow institutional investors to save on costs.
This sophisticated trading environment has allowed PTSs to accommodate electronic trading by foreign securities firms, using advanced algorithms and other trading formats. In our case, this resulted in an average daily trading volume of approximately Y1 billion between October 31 and December 31, 2008, an increase of roughly 16 percent over the previous quarter, with a record high of Y4.7 billion set on October 7, 2008. Although financial uncertainty since October 2008 made the overall market sluggish and pushed down the number of orders, the number of contracts increased, lifting the contract rate from 0.6 percent to 1.0 percent (see figures following).
Sophistication of the off-market e-marketplace
Looking back, it was the ability of PTSs to trade at par with the exchange that drove technological advances and investor interest in the off-market trading format. The sell-side enjoyed advanced functionality that automatically sought out the right market and best price, allowing for best execution and automatic order forwarding. From 2009, these systems are available to foreign securities firms, many of whom are already familiar with alternative trading venues elsewhere in the world and with Smart Order Routing (SOR) systems.
These SOR applications not only take price and cost considerations into account, but also allow orders to be electronically and automatically forwarded to the most advantageous market, including exchanges, dark pools or PTSs.
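A minimal sketch of that routing logic, with hypothetical venues, prices and fees, might rank venues by all-in cost and walk down the list:

```python
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str
    price: float     # best offer, for a buy order
    size: int
    fee_bps: float   # venue take fee in basis points (negative = rebate)

def route_buy(order_qty: int, quotes: list[Quote]) -> list[tuple[str, int]]:
    """Split a buy order across venues, cheapest all-in cost first."""
    routes, remaining = [], order_qty
    for q in sorted(quotes, key=lambda q: q.price * (1 + q.fee_bps / 10_000)):
        if remaining <= 0:
            break
        take = min(remaining, q.size)
        routes.append((q.venue, take))
        remaining -= take
    # Any unfilled remainder would rest or re-route in a real router.
    return routes

quotes = [Quote("TSE", 101.0, 3_000, 0.5),
          Quote("PTS-1", 100.9, 1_000, 0.8),
          Quote("PTS-2", 101.0, 2_000, -0.2)]  # rebate venue
print(route_buy(4_000, quotes))   # -> PTS-1 first, then PTS-2, then TSE
```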
The TSE is developing a new high-speed exchange system, scheduled for launch in January 2010, that takes into account the proliferation of PTSs and the widespread adoption of electronic markets.
To be compatible, sell-side SOR systems will be expected to provide high-speed order execution capabilities. Also, as best execution orders grow more sophisticated and are broken into smaller units, transactions that allow large orders to be traded using VWAP (volume-weighted average price) on PTSs look set to become commonplace in 2009.
In summary, the rapid development of PTSs in Japan over the past year is driving the establishment of a new execution market that supplements the exchange both electronically and functionally, and looks set to reduce trading and liquidity risks across the broader market. In parallel, Japan will see the advanced automation of best execution for institutional investors, and continued adoption of SOR systems by the sell-side to allow for best execution.
Adapting your trading style – How the changing market landscape is driving new skills
Appetite for risk has never been lower and liquidity has never been tougher to identify. The downstream effect is impacting every part of the electronic trading business and culture. Quod Financial’s Ali Pichvai examines where he sees the sea change in trading skills and style.
We are in the midst of a structural, and consequently cultural, change in the capital markets. Firms’ appetite for risk is shifting and counterparty risk is now high on the agenda. This trickles down to each function and aspect of investment and execution: from investment decision-making, to risk management, to the mechanics of electronic trading. In addition, liquidity is increasingly fragmented across a multitude of pools, which is affecting how electronic markets evolve. So how does this landscape affect how firms will trade?
Changes on the buy-side, and how the future will look, are still uncertain. The hedge fund industry, the great innovator among investor classes, has in large part been discredited and its model will need to change drastically. The quasi-demise of this large segment will leave a void that needs to be filled. It seems the future lies in more transparent, better risk-managed, low-cost listed products, which respond to investors’ appetite for global, multi-asset investment and execution strategies. Furthermore, it is now clear that liquidity and solvency are intimately linked, and that evaporating or volatile liquidity creates systemic solvency risk. This will, without doubt, have a large impact on future capital market structures.
The buy-side transformation will inevitably accelerate the pace of the current secular trends of more electronic trading on centrally cleared liquidity venues and competing global or regional multi-asset liquidity venues. NYSE Euronext, as a global multi-asset liquidity venue, seems to be the role model for all other market participants. The liquidity fragmentation, as observed today, will certainly be greater and more complex going forward. It also seems we have entered a second age of liquidity fragmentation, with three phenomena which have appeared, or been reinforced, in the current turmoil.
Liquidity is becoming ever more dynamic. As competition increases, price wars become more frequent, and pricing models are being altered to attract more and more liquidity. For instance, the rebate model for passive orders (i.e. by resting a passive order, you can receive a fee) has often been used as an effective marketing tool for new alternative trading systems. Clients are therefore moving their execution from venue to venue on a real-time basis as pricing evolves within a competitive landscape, making liquidity ever more dynamic.
Liquidity is becoming less transparent. As new dark pools and broker internalisation proliferate – US equities now see 17% of executions in these dark venues – the level of transparency is decreasing. This creates a massive trading challenge: transparent liquidity matters because it underpins efficient price discovery, which is lost in a non-transparent execution model. As transparency decreases, there is a need, in addition to market data sourced from the different displayed prices, to move to real-time post-trade analysis to rebuild a more intelligent picture of liquidity.
Volatility increases fragmentation and execution risk. Intraday volatility, even measured over very short periods, is greater than at any other time, and far larger than the impact of incurred costs. As seeking liquidity becomes more important, fragmentation will increase. The result is a new type of risk that needs to be mitigated: execution risk. An investment case can be rendered fully redundant if execution in a highly volatile market is not properly performed; the risk ranges from the inability to execute at all to executing too far away from the investment decision. Another obvious effect is that the widespread algorithmic trading engines, which were built for low-volatility markets, have become obsolete. Nowadays, on an averagely volatile day, it is not uncommon to see 300 basis points of volatility, which dwarfs a single-digit basis-point cost impact. The next cycle of investment in algorithmic trading therefore needs to be redirected towards liquidity-seeking algorithmic trading (also called smart order routing – arguably a misnomer, since it is simply routing rather than delivering a real-time decision-making process).
It is therefore reasonable to hope that the shift will mean that, at last, fully supported multi-asset trading becomes a reality. The change is more than overdue. The vast majority of current investment strategies are already multi-asset or even cross-asset, but are not properly supported by vendor, broker-based or even exchange-class trading and execution systems. The pressure to respond to this unfulfilled investor demand, the need to manage risk holistically (for any given client across different asset classes, without being too lax or too restrictive), and the drive to reduce the cost of overall trading systems and infrastructure are potent drivers to make this shift happen.
The mainstream sentiment, driven by the backlash to the financial meltdown, is that capital markets and trading are beyond control and need to revert to a simpler model. This is a simplistic response. In reality, while the level of financial innovation increased, technology investment did not keep pace, particularly after the mild recession of 2001. It is unrealistic to have expected increasingly complex concepts, such as liquidity fragmentation, execution risk and multi-asset/cross-asset trading, to be handled competently with the same old technology. If history is any guide, the future will likely be even more complex. So those investing intelligently in technology will be tomorrow’s winners.
Looking specifically at the execution world, there is a dire need to apply new technology that can manage the current degree of complexity, and to repatriate execution decision-making within the firm instead of fully outsourcing it to liquidity venues or sell-side players. The good news is that adaptive trading technologies are becoming both available and affordable for most market participants.
Adaptive trading technologies deliver real-time decision-making mechanisms which rely on real-time market data and post-trade data. They direct and seek liquidity whilst fulfilling the original execution objective (or best execution policy). This allows them to handle complex scenarios, with multiple execution policies and numerous liquidity venues, on a multi-asset basis. Realistically, these technologies need to leverage the current trading investment in infrastructure, upstream and downstream systems (such as risk management) and connectivity, and be as non-disruptive as possible.
There are, however, potentially major failure points. These include:
- Speed against complexity: There is a trade-off between the complexity of the decision-making process (the algorithm) and its effectiveness and speed. The right balance is usually found empirically, on a case-by-case basis.
- Testing the machine requires a new approach: The classical quality assurance and testing approach does not apply to these very complex decision-making processes. The combinatorial permutations for testing just three to four liquidity venues against real-life client trading behaviour run into hundreds of thousands of cases; exhaustive testing of that space is both excessively costly and practically impossible. The only way forward is therefore a new approach based on statistical sampling of the test cases (and the art of sampling is itself difficult) to simulate a real production environment as closely as possible, as in the sketch below.
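As a minimal illustration of that sampling idea (hypothetical dimensions, deliberately scaled-down numbers), the test space can be enumerated and then sampled at random rather than run exhaustively:

```python
import itertools
import random

# Hypothetical test dimensions for a smart router facing four venues.
venues = ["VENUE_A", "VENUE_B", "VENUE_C", "VENUE_D"]
venue_states = ["open", "halted", "degraded"]
order_types = ["market", "limit", "iceberg"]
sizes = [100, 10_000, 250_000]
volatility = ["calm", "normal", "stressed"]

# Every combination of order profile x market regime x per-venue state.
# Even this toy space has 3 * 3 * 3 * 3**4 = 2,187 cases; add client
# behaviour sequences and timing and it explodes into the hundreds of
# thousands of cases mentioned above.
population = list(itertools.product(
    order_types, sizes, volatility,
    itertools.product(venue_states, repeat=len(venues)),
))
print(f"full space: {len(population)} cases")

random.seed(7)                             # reproducible test suite
sample = random.sample(population, k=200)  # statistically sampled subset
```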
One key application of adaptive trading technologies is the ability to address decreasing transparency through real-time post-trade analysis of non-transparent liquidity. This information can then be complemented by market data (assuming that dark pools usually trade at mid-price) and used as a basis for deciding whether or not to execute directly on a given dark pool.
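A minimal sketch of that check, with hypothetical fills and a buy-side view, compares each dark fill against the lit-market mid at fill time:

```python
def slippage_vs_mid_bps(fill_px: float, bid: float, ask: float) -> float:
    """Buy-order slippage of a dark fill versus the lit mid, in basis points.

    Positive = paid worse than mid; persistently positive numbers argue
    against continuing to route to that pool.
    """
    mid = (bid + ask) / 2
    return (fill_px - mid) / mid * 10_000

# Hypothetical fills: (fill price, lit bid at fill time, lit ask at fill time)
fills = [(25.105, 25.10, 25.11), (25.112, 25.10, 25.11), (25.104, 25.10, 25.11)]
per_fill = [slippage_vs_mid_bps(f, b, a) for f, b, a in fills]
print(f"average slippage vs mid: {sum(per_fill) / len(per_fill):+.2f} bps")
```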
In conclusion, the changing market landscape demands a shift to adaptive trading technologies and that shift needs to happen now. Firms are still investing in technology, albeit strategically, and in order to keep ahead of their competitors in tough market conditions, firms need intelligent trading systems that can adapt to a constantly shifting environment.
The changing role of the broker – The global financial crisis has triggered a global revolution of the buy-side/sell-side relationship
By Ofir Gefen
Is the financial turmoil of recent months driving change in the brokerage model faster than ever? ITG’s Ofir Gefen believes the role of the broker is changing in an irreversible way, and that geographic differences are becoming less relevant to the brokerage model as the more pressing concerns of high volatility and trading costs, counterparty risk and business ownership come to the fore.
Recent turmoil in the financial and investment banking industry at a global level is pushing sweeping changes to the brokerage model across the board. Despite fundamental differences in market structure and buy-side behaviour between the US electronic trading world and that of Asia, the financial crisis is now creating convergence at a quicker rate than anticipated. Both buy and sell-side are being forced to adapt, prioritise and focus on costs, risk and efficiency. As the role of the broker changes, market boundaries are deemed less relevant as trading costs, volatility and counterparty risk move centre stage.
Having spent many years working in both product development and sales in the US, when I moved to Asia I expected to encounter similar challenges to those we had already gone through in the US. I soon realised it is easy to underestimate the issues created by the diverse market structures, regulations, cultures and trading behaviours across the Asia Pacific markets. As such, I prepared myself for a relatively slow evolution before the buy-side here would use brokers in a similar way to their US counterparts. The US market was characterised by a highly diversified range of electronic trading tools, mechanisms and providers, where the market had evolved in response to a range of demands from the buy-side, but operated within a homogenised market structure.
In Asia the range of brokers and the services they offer have generally been less differentiated, with markets gradually opening up to DMA and the adoption of algorithms, crossing networks and alternative trading venues still in the comparatively early stages. A 2007 Greenwich Associates report focusing on the US market looked at the evolution of electronic trading in response to sophisticated segmentation of liquidity, while at the same time talk in Asia was more at the mechanical level of developing connectivity and adoption of electronic trading.
As we move towards the second quarter of 2009, similar reports have a very different tone. The December 2008 TABB group study ‘US Institutional Equity Trading: Crisis, Crossing and Competition’ states: “These are tumultuous times… Trading becomes a treacherous affair, with all previous metrics of volatility and transaction costs going right out the window. Both asset managers and brokerage firms are in uncharted waters. The implications for the relationship between the buy-side and sell-side are far reaching and many of the assumptions we hold about that relationship will be overturned”. This is as true for Asia as it is for the US.
Moving beyond old-style equity execution
The traditional value of the sell-side broker has often been intrinsically linked to the provision of stock research, access to IPOs and capital commitment. In an environment where markets are swinging wildly based on high-level economic and social factors, IPO pipelines have dried up and capital commitment is now too risky to offer, the traditional sell-side brokerage model is called into question. This has been compounded by the collapse of Lehman Brothers, the acquisition of some large investment banks and the recategorisation of others as commercial banks. Some asset management firms and hedge funds are now stuck trying to reclaim commission credits from the bankrupt Lehman Brothers, and counterparty risk is no longer just the concern of compliance officers. Its effects are now directly relevant to buy-side trading desks. Simply put, asset managers need to move beyond the old methods of equity execution and reduce their dependency on traditional counterparties, wherever in the world they are trading.
In Asia Pacific, the variety of market structures, the lack of any common regulatory framework and the logistical challenges buy-side trading desks have faced in establishing trading across the region have dominated the evolution of electronic trading. Traders have often experienced different challenges in finding liquidity than their US counterparts. Nevertheless, buy-side adoption of DMA, algorithms, dark liquidity and crossing has been steadily growing, helped by the commitment of firms with different types of brokerage models to bringing new electronic options to the region. In the past few years, not only have the investment banks invested significantly in developing electronic trading tools for the region, but innovative agency brokers, crossing networks and, imminently, MTF-like structures have brought new opportunities to the market.
ONE YEAR: FIVE CHANGES – Adapting to the new trading environment in the EMEA region
The Oxford English Dictionary defines ‘change’ as the “move from one system or situation to another”. Toby Corballis, CEO of Rapid Addition, believes it is a word that all too easily describes the financial turmoil of the past six months. With a particular focus on EMEA, Corballis examines these changes and the associated risks facing firms and software vendors across the financial marketplace over the coming 12 months.
Given the continuing volatility of global markets and the large-scale movement of people, data and systems, change looks set to be the order of the day. Or, in the words of Robert Zimmerman, ‘The Times They Are a-Changin’’. In the world of electronic trading, as in many others, movement begets change and change begets risk. Over the coming year, the industry looks set to face a series of challenges, changes and risks.
Change #1: Consolidation of software
With apologies to any egg-sucking grannies, once upon a time, to attract order flow, the sell-side gave away various software packages to the buy-side. It was the same kind of logic applied by proponents of the ubiquitous loyalty card, and buy-side firms saved money, as they didn’t have to pay for their software upfront. Actually, costs were partially hidden in sell-side fees and partially realised through the inconvenience of having to use a multitude of different solutions that did more-or-less the same thing but connected to different counterparties.
This may seem trivial, but it represents a seismic shift in the way systems make it to market. Money for these systems will no longer flow from the sell-side to the software vendors. The relationship is now owned by the buy-side. No one wants to pay for something that used to be free, so it’s unsurprising that there are mutterings in the corridors. Expect this crescendo to peak as many existing ‘free’ contracts expire, creating a drive to consolidate on as few solutions as possible. This consolidation carries a number of risks for all sides of the financial Rubik’s Cube.
Buying software requires an understanding of the software procurement process, a process that is itself currently changing. Questions like “how long has the vendor been trading?” are likely to matter less than “who is the ultimate parent?” and “how solvent are they?”. Ownership is likely to say much about the chances of the software being able to support your requirements later on. Where safety in numbers was once seen as good (vendors with more clients were less likely to go under), it doesn’t work so well if you’re in a long queue of creditors to a failed business.
Knowing who the ultimate parent of a company is gives other insights. Do the company’s principals really understand your industry (they need to if you want some assurance that your system will change to keep up with the times)? Do they have a reputation for innovation? What is their commitment to research and development?
Change #2: Increased regulation as a by-product of political oratory
Politicians love to claim credit, deflect blame, and be perceived as tough guys. A quick scan of the local media is enough to see that the current political culture is very much one of blame. Fallout from political rhetoric tends to materialise in legislation which almost always beats the drum for “greater transparency and accountability”. What is actually meant, of course, is greater auditability: the ability to recreate a moment in time so as to prove that the course of action taken was fair and in the best interest of the client. This is all good, and there are many ways to achieve it; however, it would be hard to argue that computer records are anything other than the most accessible of these.
Imagine trying to model all of the credit default swap (CDS) contracts into which Lehman Brothers entered. One problem is that a CDS could be created in so many ways: phone, instant messenger, and so on. Finding and recreating all of these would be a nightmare. That regulators have an appetite for making things more auditable is evident in recent speeches by Charlie McCreevy, the European Commissioner, who has been looking to introduce legislation to move more CDS trading on-exchange.
If more assets are traded on-exchange, more software connections are required to connect venues with participants. That, in turn, implies more trading platforms and more capacity on the participants’ side to handle increased transactional volumes.
Change #3: Fragmentation of liquidity.
Recent times have seen an increase in the number of trading venues, with Multilateral Trading Facilities (MTFs) like Chi-X, Turquoise and BATS entering the scene. Later this year the London Stock Exchange plans to launch Baikal, a pan-European dark liquidity trading venue.
I’m not going to argue whether fragmentation is a good thing or a bad thing, but it does present several challenges. The first is being able to find liquidity fragmented across multiple platforms, some of which are hidden. The second is being able to connect to, and speak the language of, many venues at once. Fragmentation in the financial marketplace is reflected in vendor routes to market. Increasingly, the connectivity method du jour is FIX. The London Stock Exchange recently announced that it will FIX-enable TradElect using FIX 5.0, and Chi-X already uses FIX. This makes a great deal of sense, as it lowers barriers to connecting with additional counterparties and leverages more than 17 years of a protocol that has had extensive industry peer review. The trick is to find a FIX engine that meets your needs.
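Underlying all of this is the FIX tag=value wire format, which is simple enough to sketch in a few lines of Python. The message below is a stripped-down, hypothetical NewOrderSingle; session-level fields (MsgSeqNum, SendingTime) that any real engine would add are omitted for brevity:

```python
SOH = "\x01"  # FIX field delimiter

def fix_message(msg_type: str, fields: list[tuple[int, str]]) -> str:
    """Assemble a FIX 4.2 tag=value message with correct BodyLength and CheckSum."""
    body = f"35={msg_type}{SOH}" + "".join(f"{t}={v}{SOH}" for t, v in fields)
    head = f"8=FIX.4.2{SOH}9={len(body)}{SOH}"
    checksum = sum((head + body).encode()) % 256  # sum of bytes, mod 256
    return head + body + f"10={checksum:03d}{SOH}"

# Hypothetical NewOrderSingle (35=D): buy 100 shares at market.
msg = fix_message("D", [
    (49, "BUYSIDE"),   # SenderCompID
    (56, "BROKER"),    # TargetCompID
    (11, "ORD-1"),     # ClOrdID
    (55, "VOD.L"),     # Symbol
    (54, "1"),         # Side: buy
    (38, "100"),       # OrderQty
    (40, "1"),         # OrdType: market
])
print(msg.replace(SOH, "|"))  # pipe-delimited for readability
```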
Change #4: Aught for naught, and a penny change. Total cost of ownership.
Everyone likes something for free and when budgets are constrained there is a natural tendency towards cutting costs. There is, as we all know, no such thing as a free lunch. There is a distinct risk that people view open source software as exactly that. Whilst licences are free, the downstream costs of open source software are often overlooked when calculating the total cost of ownership.
Manuals, if they exist at all, are often a paid-for extra. Help with installation is an add-on and can be upwards of $2,000 per day. Maintenance? Another extra, and firms often need to employ their own technical staff. Updates cannot be guaranteed (a quick look at the website for one of the most popular open source FIX engines suggests it hasn’t been updated to support the most recent three versions of FIX), so this cost is often borne by the licensee. The ownership structure is sometimes not clear, adding the potential for future licensing issues.
Finally, there is no concept of liability with open source products, unless you specifically buy liability insurance, which these days is not cheap.
De-risking the cost of ownership requires that those making the purchasing decisions understand the costs upfront and can clearly see a return on investment within a reasonable time frame.
Change #5: Natural selection. The ISV landscape.
Software vendors will need to be flexible in order to survive. That means being agile enough to adapt offerings quickly and deliver bespoke solutions to their clients’ needs. We’ve all heard of large software providers refusing to make changes to software until they encounter the same request multiple times from multiple clients. That sort of behaviour is unlikely to survive the changing software landscape.
A lot of this behaviour occurred because making changes to software was time-consuming and costly. It required developers to open up the software, make changes, go through iterations of testing, and deploy to a large client base. Advances in software and associated methodologies mean that much of the logic can now be defined outside the code. Business rules engines embrace this concept, as do data-driven FIX engines that can read in the FIX repository (or a derivative thereof) and instantly understand the logic for each transaction, while maintaining an edge on performance.
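A minimal sketch of the data-driven idea, with a hypothetical two-message dictionary standing in for the FIX repository (the required-tag sets are illustrative subsets, not the full specification): message logic lives in data, so supporting a new message becomes a data change rather than a code change.

```python
# Stand-in for the FIX repository: message definitions as data, not code.
REPOSITORY = {
    "D": {"name": "NewOrderSingle", "required": {11, 55, 54, 38, 40}},
    "8": {"name": "ExecutionReport", "required": {37, 17, 150, 39}},
}

def validate(msg: dict[int, str]) -> list[str]:
    """Check a parsed message (tag -> value) against the data-driven spec."""
    spec = REPOSITORY.get(msg.get(35))
    if spec is None:
        return [f"unknown MsgType {msg.get(35)!r}"]
    missing = spec["required"] - msg.keys()
    return [f"{spec['name']}: missing tag {t}" for t in sorted(missing)]

order = {35: "D", 11: "ORD-1", 55: "VOD.L", 54: "1", 38: "100"}
print(validate(order))   # -> ["NewOrderSingle: missing tag 40"]
```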
Change #6: The rise of social networks.
The appeal of Facebook and MySpace is that they connect friends with each other across the globe. This has practical applications, of course, in connecting people with a common interest, but is very horizontally focused. However, more and more social networking sites aimed at specific verticals are starting to spring up. An example is Hedgehogs.net, a social website aimed at the investment community. Sites like these incorporate vendor plug-ins (widgets) that offer users useful tools, but the software vendor has to recognise the value of the platform and its viral marketing potential, and have software capable of seamless integration.
Many of the challenges faced in EMEA are, undoubtedly, global in nature. To ensure survival, companies need to look at the value of their IT investments, whilst vendors will need to offer flexibility at appropriate prices. It’s a big small world out there but the biggest risk, as always, is in doing nothing.
Staying Ahead – Advanced, yet flexible, technology key to network success
Exchanges are modernizing, market participants are investing heavily in technology and electronic trading is on the rise. The end result? Firms today face a challenging new market environment, one that is fractured, extremely competitive, and constantly evolving. Portware’s Damian Bierman explains why bringing in the experts could be the best investment you’ll ever make.
Today’s global trading landscape is vastly different than it was twenty years ago. Structural, economic and regulatory forces, combined with major advancements in trading technology, have ushered in a new era of automated trading. In the United States and Europe, numerous electronic market centers – exchanges, alternative trading systems (ATSs), electronic communication networks (ECNs), and crossing networks – compete for client order flow, offering superior execution speeds, reduced market impact, anonymity, or a combination of the three. New regulations in the US (Reg NMS) and Europe (MiFID) have hastened the pace of structural and technological change. As for Asia and other emerging markets, they are not far behind.
At the same time, all market participants – buy-side firms in particular – are under increased pressure to reduce trading costs and rationalize their IT budgets. This is not a new trend. While the recent economic downturn is partly to blame, the goal of bringing increased efficiencies to the trading desk predates the global credit crisis. For firms in all global markets, the challenge is twofold: deploy advanced trading technology that will allow you to navigate – and exploit – today’s complex marketplace, but do so with an eye towards long term value and scalability.
New requirements for a complex market
What constitutes “advanced trading technology,” and what do firms really need to compete effectively in today’s market? To answer that question, consider the challenges posed by today’s market:
- Access to fragmented liquidity.
Firms today face an increasingly electronic global marketplace, and one that offers a wide choice of execution destinations. However, with increased choice comes increased responsibility for execution quality. In such an environment it is incumbent on market participants to take greater control of their order flow. From a trading standpoint, this means the ability to access all available sources of liquidity – including broker execution algorithms, crossing networks, exchanges, ECNs and other pools of non-displayed liquidity – from a single trading environment.
- Advanced trading tools and analytics.
Given the complexity of the market, traders must have access to integrated toolsets with which they can make informed decisions. Pre-trade transaction cost analysis (TCA), real-time benchmarking and performance measurement, portfolio-level analytics and post-trade TCA – all of these help traders to efficiently gauge trading costs and route orders accordingly. Consolidating and integrating these toolsets into the trading process is critical. Not only does this drive efficiency, it allows for the analysis of orders at any point during the trade workflow cycle, helping firms gauge traders’ decision-making processes and overall performance; a minimal sketch of one such post-trade measure follows this list.
- Performance.
For many firms, the recent spike in volatility has been a painful experience. Soaring trade volumes and related message traffic have crippled legacy trading systems. Unfortunately, these kinds of volumes are fast becoming the rule, not the exception. Technology that can withstand this kind of message traffic is no longer a luxury, but an absolute necessity.
- True multi-asset support.
The ability to trade multiple assets on a single platform offers numerous advantages to firms today. First, connecting data feeds and workflow applications to a single platform reduces integration costs and operational overhead. Second, multi-asset systems can support advanced strategies such as auto-hedging and multi-asset arbitrage, both of which are becoming increasingly popular.
- Flexibility and ease of integration.
Given the web of interconnected workflow applications, data feeds, and other third party and proprietary systems that firms have deployed, having a system that can easily interface with the rest of a firm’s technology infrastructure is another key requirement. The alternative – closed systems based on rigid technology architectures and proprietary languages – is difficult and prohibitively expensive to deploy. While the short-term costs associated with deploying closed systems are considerable, the long-term costs can be enormous. The more difficult an application is to integrate, the harder it will be to upgrade or customize in the future. Eventually, such systems become part of a firm’s legacy infrastructure: out of date and costly to maintain, but too expensive to replace.
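As promised above, here is a minimal sketch of one post-trade TCA measure, implementation shortfall. The inputs are hypothetical, and real TCA systems also account for unfilled quantity, fees and alternative benchmarks (arrival price, VWAP and so on):

```python
def implementation_shortfall_bps(side: str, decision_price: float,
                                 fills: list[tuple[float, int]]) -> float:
    """Average fill price versus the price at the moment of the investment
    decision, in basis points. Positive = cost to the fund."""
    qty = sum(q for _, q in fills)
    avg_px = sum(p * q for p, q in fills) / qty
    sign = 1 if side == "buy" else -1
    return sign * (avg_px - decision_price) / decision_price * 10_000

# Hypothetical buy order: decision price 25.10, filled in two pieces.
fills = [(25.12, 4_000), (25.15, 6_000)]
print(f"{implementation_shortfall_bps('buy', 25.10, fills):.1f} bps")  # ~15.1
```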
A Multitude of Risks – Multiple primes, multiple systems … is integrated technology the only solution?
The way we trade has never been more complex. The current environment is overloaded with the demands of multi-asset, multi-broker, multi-exchange, multi-market and multi-national strategies. FTEN’s Ted Myerson rakes through the coals of the current financial turmoil and offers his perspective on the best model for success.
Sponsored access comes with a multitude of parts and complexities, all of which carry consequential risks. Inherent to this type of business are multiple prime broker relationships, various trading systems, competing liquidity destinations and different asset classes traded across diverse global markets. The constantly evolving financial environment, amplified by the viral instability of financial institutions, changes in regulations and the globalization of trading, just seems to proliferate more of it – more relationships, more systems, more markets, more regulations and oversight requirements.
This environment has become fundamentally more chaotic and consequently a lot riskier. Without the proper tools in place to aggregate all of the pieces, the sum of these components can lead to compartmentalization and blind spots. This threat of organizational disorder presents significant risks for both the buy-side and sell-side.
Technology that aggregates and consolidates all of these disparate components can mitigate the friction and chaos, providing a singular view into all of this activity. Because of the different interests of these distinctive, and sometimes competitive, components, technology is truly the only way to control the multitude of risks associated with sponsored access in this multi-faceted environment.
The Multi-Prime Brokerage Model
The market turmoil of the last year has accelerated the demise of the single prime brokerage model. Hedge funds – big and small – are increasingly moving towards establishing multiple custodial relationships in an effort to avoid the fallout from a bank failure.
The need for hedge funds to spread their assets over multiple prime brokers, juxtaposed with the multiple accounts and trading strategies involved in their businesses, results in a myriad of complexities. Without the proper infrastructure, hedge funds can experience a “silo” view of realized and unrealized profit and loss, overall position, and exposure.
This presents a new set of risk management challenges for buy-side clients.
At the foundation of this issue is the need to collect disparate position and transaction data across multiple prime relationships. Broker-neutral, agnostic technology that processes drop copies from all trading destinations and prime brokers, and rolls them up into one dashboard, can help alleviate the disjointed view. Since hedge fund administrators typically only view data after the market close, this roll-up must occur in real time, throughout the trading day, for there to be complete electronic control over position and exposure.
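A minimal sketch of that roll-up, with hypothetical primes, accounts and fills; a production system would update this view event by event as each drop copy arrives, and would carry prices through to P&L:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class DropCopy:
    prime: str        # sponsoring prime broker
    account: str
    symbol: str
    qty: int          # signed: + buy, - sell
    price: float

def roll_up(drop_copies: list[DropCopy]) -> dict[str, int]:
    """Consolidate fills streamed from every prime and venue into one
    net position per symbol -- the single 'dashboard' view."""
    net = defaultdict(int)
    for dc in drop_copies:
        net[dc.symbol] += dc.qty
    return dict(net)

stream = [DropCopy("PrimeA", "fund1", "XYZ", +5_000, 10.00),
          DropCopy("PrimeB", "fund1", "XYZ", -2_000, 10.05),
          DropCopy("PrimeB", "fund2", "ABC", +1_000, 42.10)]
print(roll_up(stream))   # -> {'XYZ': 3000, 'ABC': 1000}
```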
Multiple Trading Systems
As buy-side clients spread their assets and risk over multiple prime brokers, the number of trading systems also increases, compounding the disparate sources of data and the threat of silos and blind spots. These risks are further compounded by the staff and resource cutting measures currently affecting the financial services industry: there are simply not enough screens, or people to watch them, in today’s market. This trend is expected to continue into the near future.
Trading desks need an agnostic platform to consolidate the total exposure of committed capital across clients and accounts into one real-time, intra-day dashboard view.
Multiple Asset Classes in Multiple Markets
Further complexities arise as electronic trading strategies begin to embrace global securities, options and futures. Since this is a significant and growing trend, consideration must be given to different currencies, time zones, regulations and taxes when assessing risk exposure and overall positions.
Technology that can aggregate the impact of these inter-market issues and calculate their net effect on the portfolio is paramount for risk management at firms whose trading strategies span multiple assets and geographical markets.
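As a minimal sketch, netting positions across markets reduces to converting each market value into a base currency before summing; the rates, venues and positions below are hypothetical:

```python
# Indicative FX rates to the base currency (hypothetical values).
fx_to_usd = {"USD": 1.0, "JPY": 0.0102, "EUR": 1.31, "CAD": 0.80}

# (market, currency, market value of position; negative = short)
positions = [("TSE",   "JPY", 250_000_000),
             ("NYSE",  "USD", 1_200_000),
             ("Xetra", "EUR", -900_000)]

net_usd = sum(value * fx_to_usd[ccy] for _, ccy, value in positions)
print(f"net portfolio exposure: {net_usd:,.0f} USD")
```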
ISO 20022 Financial Messaging Standard – Neither a Tower of Babel nor an Esperanto
The ISO 20022 financial messaging standard began with the ambition of creating a common protocol for the entire financial services industry. After several years of effort, the standard has evolved into what promises to be a common business model that is flexible enough to incorporate current industry standard protocols, including FIX.
The story of the Tower of Babel tells of the effort to build a tower to the heavens. To stop the endeavour, God is said to have made each worker speak in their own tongue. Tellingly, without a common language, the tower never reached the heavens.
Esperanto, on the other hand, was invented in the late 1800s with the goal of creating a common international language. Today, while Esperanto still has devoted speakers around the world, it never gained – or maintained – popular support.
Efforts at developing a common financial messaging protocol have faced similar challenges: either too many different languages, or only one which has failed to gain widespread appeal.
An alternative approach – a middle road – is emerging within the International Organization for Standardization (ISO). In June 2008, it was agreed that, where a strong business case was identified, the ISO 20022 financial messaging standard could support a domain-specific syntax, such as FIX.
The relationship between ISO financial messaging standards and FIX is not new. Previous cooperation between SWIFT and FIX Protocol Ltd resulted in the FIX 4.3 pre-trade and trade messages being reverse engineered into the ISO 20022 model. When concerns were raised that the industry perceived the FIX message syntax as being rendered obsolete by the ISO 20022 XML syntax – which was never the intention – FPL reduced its participation in the ISO 20022 process, but never fully abandoned the effort. With renewed and improved cooperation between FIX Protocol Ltd and SWIFT, FIX 5.0 underwent similar reverse engineering to ensure its ongoing relevance within the ISO 20022 model.
XML and the proliferation of messaging standards
The evolution of messaging standards demonstrates the challenges faced by ISO in creating a messaging standard that included a syntax that was both robust and flexible enough to gain and maintain industry-wide support.
Created during the 1990s and based on SGML, an earlier document markup standard, XML set out to provide a format that could be used across a number of different applications. Its simple, self-describing format and assumed pervasiveness led to XML quickly being adopted as a messaging syntax by several financial organizations. In quick succession, several variations sprang up, including XBRL, FIXML, RIXML, MDDL and FpML.
In response to this proliferation and XML’s perceived dominance, a new initiative emerged, aimed at unifying the various standards (including the emerging XML standards) into a common standard with a common XML syntax. It was to be known as ISO 20022.
One of the key drivers for this convergence within the securities banking segment came from the Global Straight Through Processing Association (GSTPA), which had planned to base its messaging standard on the planned ISO 20022 XML messaging standard. In the spirit of convergence, and given XML’s dominance, the FIX community, with SWIFT’s assistance, reverse engineered the pre-trade and trade messages from FIX 4.3 into the ISO 20022 model. The bursting of the internet bubble coincided with the demise of the GSTPA, and the information technology view that XML would be the only message syntax also waned, slowing momentum for convergence.
A standard for managing a messaging standard
ISO 20022 is not a traditional messaging standard. Rather, it defines a standard for the processes by which the standard itself is created and maintained. This is an important evolution. It follows the incorporation of process methodology standardization within ISO, such as the ISO 9000 and ISO 14000 quality process standards, and the realization that message syntax itself will evolve over time.
The ISO 7775 standard was primarily concerned with standardizing the message syntax used on the SWIFT network. ISO 15022, based on ISO 7775 and SWIFT’s proprietary messages, was an initial step away from this methodology and laid the groundwork for ISO 20022. As a result, SWIFT largely dominated these standards. With the aim of unifying disparate standards across financial services, ISO 20022 was officially launched and required a broader governance model.