Kuala Lumpur: The Internal And External Drive For Reform
In late March GlobalTrading hosted a roundtable in Kuala Lumpur to examine best practices for electronic trading in Malaysia. Attended by Bursa Malaysia, the Securities Commission Malaysia, a range of local buy-side and sell-side firms, and regional FIX Trading Community members, the event focused on the changing nature of electronic trading in the region, how Malaysian firms can capitalise on regulatory changes in Singapore and Hong Kong, and new technology being made available to the region.
With electronic flow already at 30-40%, Malaysia is firmly established in the electronic trading arena, but compared with developed markets in the region, which see 70-80% of volume traded electronically, there is clearly still a lot of development potential. New technology will be at the forefront of that development as increasing foreign flows look to access Malaysia’s markets.
One key alignment needs to be made between domestic firms and their regional and international partners, as flows increasingly move both from Malaysia into the wider region and from foreign funds into Malaysia; standardised technology and stronger electronic capabilities are essential to support this growth in business. This needs to happen in the front office, with clean, reliable market data and proper risk controls, as well as in the middle and back offices.
Reform needs to be driven within firms as well, as systems need to reflect changing market practices around STP and risk control. A general trend across the financial services industry is that each market participant is much more responsible for their own trading and technology – the burden is moving onto the buy-side and vendors, and not just sitting with the brokers, and this needs careful management within firms to ensure that pre-trade risk is managed correctly.
One thread throughout the panel discussion was that change takes time and effort – no firm can change institutional habit and technology overnight, so firms need to make sure that they are prepared for future challenges before they arise. In a market like Malaysia that means balancing the local environment and specific nature of the retail/institutional mix and foreign flows with regional and global trends.
By Tim Healy, Global Marketing and Communications Manager, FIX Trading Community
The FIX Trading Community’s Working Groups and Committees have had a busy Q1, continuing the work they began in 2013. A number of existing initiatives continue to be worked on while new ones have been kicked off. Addressing new regulation is still very much a focus for the groups; however, a number of initiatives are also concentrating on streamlining the various processes involved in the pre-trade, at-trade and post-trade workflows. Below is a brief summary of some of the highlights of the work the Community is focusing on.
Global Technical Committee
The Committee has been focused on the next generation of the FIX Protocol. This release will be more than just an update to previous versions of the Protocol; it is the final framework that will allow combinations of all supported FIX standards and versions. The vision for FIX 6 is that FIX 6 engines will be able to natively talk FIX 4.2, FIX 4.4 and FIX 5.0 SP2 with those counterparties that have not yet upgraded, and talk FIX 6 with those that have. The Global Technical leadership is currently brainstorming how best to take this work forward.
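As a rough sketch of the idea only, and not the actual FIX 6 design, an engine of this kind would keep a per-counterparty dialect setting and serialise the same application-level intent in whatever version each session negotiated. All names, tags and session details below are hypothetical.

```python
# Hypothetical sketch: one engine, several FIX dialects, chosen per counterparty session.
# This is not the FIX 6 specification, just an illustration of version-per-session routing.

SUPPORTED_DIALECTS = {"FIX.4.2", "FIX.4.4", "FIXT.1.1/FIX.5.0SP2", "FIX.6"}

class Session:
    def __init__(self, counterparty: str, dialect: str):
        if dialect not in SUPPORTED_DIALECTS:
            raise ValueError(f"unsupported dialect {dialect}")
        self.counterparty = counterparty
        self.dialect = dialect

class Engine:
    def __init__(self):
        self.sessions = {}                      # counterparty -> Session

    def connect(self, counterparty: str, dialect: str) -> None:
        # In practice the dialect would come from logon negotiation or configuration.
        self.sessions[counterparty] = Session(counterparty, dialect)

    def send_new_order(self, counterparty: str, symbol: str, qty: int) -> str:
        session = self.sessions[counterparty]
        # The application-level intent is identical; only the wire format differs.
        fields = {35: "D", 55: symbol, 38: qty}
        body = "".join(f"{tag}={val}\x01" for tag, val in fields.items())
        return f"8={session.dialect}\x01{body}"

engine = Engine()
engine.connect("LegacyBroker", "FIX.4.2")       # counterparty not yet upgraded
engine.connect("ModernBroker", "FIX.6")         # counterparty on the new version
print(engine.send_new_order("LegacyBroker", "ABC", 100))
print(engine.send_new_order("ModernBroker", "ABC", 100))
```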
TCA Working Group
At the beginning of the year, the TCA Working Group completed the Best Practice guidelines for TCA for Cash Equities. Since then the co-chairs have reached out to the FIX Community for volunteers to expand the document to cover Currencies, Futures and Options, and Fixed Income. There has been an excellent response from the Community and three new sub-committees will soon be formed. Additionally, the Group would like to encourage widespread adoption of the guidelines; it is discussing internally the best way to do this and will propose a mechanism for self-certification.
OTC Products Committee
In the Fixed Income space, the current initiative is to introduce ITCH semantics into FIX. As some of the more liquid fixed income markets start to exhibit characteristics suitable for HFT, there has been a demand to incorporate some of the features of other established industry protocols into FIX. Therefore, we are working on creating a set of extensions to FIX that bring the best elements of the ITCH protocol into FIX. The work is focused on fixed income venues, but will be applicable beyond Fixed Income.
In Foreign Exchange, there are discussions with the industry to create best practices for the exchange of counterparty credit limit information on inter-dealer FX venues such as EBS and Reuters. The idea is to enable automated kill-switches from FX risk/trading platforms to inter-dealer venues, allow integration of internal credit/risk platforms with inter-dealer venues and, ultimately, lower risk and increase controls around counter-party exposure management for FX.
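A minimal sketch of how such an automated kill-switch might behave, assuming a hypothetical in-house credit monitor; the limits, exposure figures and notification call below are illustrative, not an industry specification or any venue’s API.

```python
# Hypothetical sketch: block further orders to an inter-dealer FX venue once a
# counterparty's net exposure breaches its credit limit, and notify the venue.

from collections import defaultdict

class CreditKillSwitch:
    def __init__(self, limits_usd: dict):
        self.limits = limits_usd                 # counterparty -> credit limit (USD)
        self.exposure = defaultdict(float)       # counterparty -> current exposure (USD)
        self.killed = set()

    def on_fill(self, counterparty: str, notional_usd: float) -> None:
        """Update exposure on each fill and trip the switch if the limit is breached."""
        self.exposure[counterparty] += notional_usd
        if self.exposure[counterparty] > self.limits.get(counterparty, 0.0):
            self.killed.add(counterparty)
            self._notify_venue(counterparty)     # e.g. a credit-limit update to the venue

    def allow_order(self, counterparty: str) -> bool:
        return counterparty not in self.killed

    def _notify_venue(self, counterparty: str) -> None:
        print(f"KILL: suspend {counterparty}, exposure {self.exposure[counterparty]:,.0f} USD")

ks = CreditKillSwitch({"BankA": 50_000_000})
ks.on_fill("BankA", 30_000_000)
ks.on_fill("BankA", 25_000_000)        # breaches the 50m limit
print(ks.allow_order("BankA"))         # False
```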
High Performance Working Group
The High Performance Working Group is looking at adding high-performance capability to FIX. The approach looks at several aspects: new binary encodings to minimize message size and latency, a new session layer focused on high-performance requirements, and application-layer updates to minimize semantic verbosity. So far, a few application-layer enhancements have been approved through the Global Technical Committee, three different binary encodings are available for public review (a draft standard for ASN.1 encoding, and a second iteration for Google Protocol Buffers and Simple Binary Encoding), and the first iteration of a session layer proposal is being finalized within the working group.
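To make the size argument concrete, here is a toy comparison, not Simple Binary Encoding or any of the candidate encodings themselves, between a tag=value rendering of a small order and a fixed-layout binary packing of the same fields.

```python
# Toy illustration of why binary encodings shrink messages: the same order,
# once as FIX tag=value text and once as a fixed-layout struct. The field set
# and layout are invented for illustration only.

import struct

order = {"clordid": 123456789, "symbol": b"ABCD1234", "qty": 5000, "price": 101.25, "side": 1}

tagvalue = ("11={clordid}\x0155={symbol}\x0138={qty}\x0144={price}\x0154={side}\x01"
            .format(clordid=order["clordid"], symbol=order["symbol"].decode(),
                    qty=order["qty"], price=order["price"], side=order["side"]))

# Little-endian: uint64 id, 8-byte symbol, uint32 qty, double price, uint8 side
binary = struct.pack("<Q8sIdB", order["clordid"], order["symbol"],
                     order["qty"], order["price"], order["side"])

print(len(tagvalue.encode()), "bytes as tag=value text")   # variable-length text
print(len(binary), "bytes as fixed binary layout")          # 29 bytes, fixed offsets
```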
Post Trade Working Group
Having successfully updated and reissued the guidelines for Equities in 2013, the working group hears from many industry participants that FIX is now the cornerstone for a rapidly increasing volume of post-trade workflow. Many buy-side firms prefer to adopt a workflow that connects them directly to their brokers, and with this capability now available, some buy-sides are processing more than 50% of their equity trades using FIX. Members of the working group now seek the opportunity to expand usage across other buy-side firms and across other areas of their businesses, and subgroups are now working on documenting and agreeing the standards for four other asset classes: Fixed Income, Derivatives, Equity Swaps and FX.
Market Model Typology (MMT) Working Group
The Market Model Typology (MMT) initiative aims to standardize the content of individual trade messages across equity markets in Europe. This means making sure that trades of a similar nature are flagged the same way irrespective of the data producer (trading and reporting venues).
MMT Technical Committee (TC) members are in charge of maintaining the MMT data model, which is designed to cover all trade types in Europe. Clear and unambiguous definitions are documented for all trade types to allow consistent, industry-wide implementation. In addition, MMT TC members produce comprehensive tables mapping existing proprietary data feed logic against the new MMT standard.
Coordination work is also performed to support the implementation of MMT codes in the native data feeds operated by trading venues, and their subsequent downstream integration into the feed and display products operated by vendors.
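As an illustration of what such a mapping table means in practice, the sketch below translates made-up proprietary trade-condition codes from two venues into one common set of fields; the codes and field names are hypothetical and are not the actual MMT data model.

```python
# Hypothetical sketch: map venue-specific trade flags onto one common typology so
# similar trades are flagged the same way regardless of the data producer.
# Both the proprietary codes and the target fields are invented for illustration.

COMMON_FIELDS = ("market_mechanism", "trading_mode", "dark_indicator")

VENUE_MAPPINGS = {
    "VENUE_A": {"AT": ("central_limit_order_book", "continuous", "lit"),
                "DK": ("central_limit_order_book", "continuous", "dark")},
    "VENUE_B": {"1":  ("central_limit_order_book", "continuous", "lit"),
                "9":  ("off_book", "negotiated", "dark")},
}

def normalise(venue: str, proprietary_code: str) -> dict:
    values = VENUE_MAPPINGS[venue][proprietary_code]
    return dict(zip(COMMON_FIELDS, values))

# Two different native codes resolve to the same standardised flags.
print(normalise("VENUE_A", "AT"))
print(normalise("VENUE_B", "1"))
```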
Global Buy Side Working Groups
Toward the end of last year, the distinct regional (Asia Pac, EMEA and US) working groups combined to form the Global Buy-Side Committee, as many of the themes discussed regionally had global implications. As an example, the Execution Venue Initiative comprises a number of aspects, including capturing order routing data and a review of the existing best practices documentation with a focus on Tags 29, 30 and 851. The Committee has been working with global exchanges to discuss the addition of test symbology to their environments. US regulators have also shown interest in the concept of a production test symbol to mitigate risk when new code or processes are introduced. Additionally, there is an initiative to achieve better transparency and risk management of the IPO process. Much interest has been generated of late in standardizing Fixed Income FIX messaging, and there is an ongoing review to identify where the main issues lie, with a focus on the buy-side workflow. Finally, given the increased regulatory focus on the buy-side in the electronic trading space, there have been discussions about creating a Buy-Side Regulatory sub-committee.
All of the groups offer a neutral industry forum encouraging open discussion and participation from all member firm representatives. If you would like to know more or are keen to get involved, please feel free to e-mail us at fix@fixtrading.org.
Circuit Breakers And Hong Kong Regulation
By Kent Rossiter, Head of Regional Asia-Pacific Trading, Allianz Global Investors
What would you like to see from circuit breakers in Hong Kong? Dynamic, static, market level, instrument level etc.
I’d like exchange circuit breakers to kick in at the single stock level to avoid unwarranted sharp moves, which are more often the result of a fat-finger or input error than an intentional action. There may be cases where stocks move sharply on actual news, but even then it’s not unusual for the stocks to overshoot because quoted liquidity isn’t deep enough to absorb the one-sided, skewed orders. I’m not saying I recommend the Taiwan model of randomised batch auctions roughly every ten seconds, as opposed to the continuous trading used on most exchanges, but it’s a solution worth considering and one backed by some academic studies.
When a stock encounters a sudden sharp move, a cooling-off period, say a few minutes where the stock locks at a price before being allowed to move a few percentage points further, may be a good idea. There’s some debate over the right ‘time out’ for traders to pull orders, change limits, or adjust strategies, but it’s probably better to have some rational human thought applied to the trading instead of it being driven solely by the fastest automated computers in the blink of an eye.
Implementing a mechanism so all stocks have the ability to reach clearing prices on exceptional news or in the minutes running up to the market close is a challenge. The eventual solution will probably be dynamic and may use the overall market’s level as a reference point.
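A minimal sketch of the kind of single-stock, dynamic band described above: halt the instrument for a cooling-off period when the price moves more than a set percentage from a short rolling reference. The threshold, windows and halt length are illustrative assumptions, not any exchange’s actual parameters.

```python
# Hypothetical single-stock circuit breaker: compare each trade against a rolling
# reference price and trigger a cooling-off halt on an outsized move.

from collections import deque

class StockCircuitBreaker:
    def __init__(self, band_pct=0.10, ref_window=60, halt_seconds=300):
        self.band_pct = band_pct            # e.g. +/-10% from the reference price
        self.ref_window = ref_window        # seconds of trades in the reference window
        self.halt_seconds = halt_seconds    # cooling-off period length
        self.trades = deque()               # (timestamp, price)
        self.halted_until = 0.0

    def on_trade(self, ts: float, price: float) -> bool:
        """Return True if trading may continue, False if the stock should halt."""
        if ts < self.halted_until:
            return False
        while self.trades and ts - self.trades[0][0] > self.ref_window:
            self.trades.popleft()
        if self.trades:
            ref = sum(p for _, p in self.trades) / len(self.trades)
            if abs(price - ref) / ref > self.band_pct:
                self.halted_until = ts + self.halt_seconds
                return False
        self.trades.append((ts, price))
        return True

cb = StockCircuitBreaker()
print(cb.on_trade(0.0, 10.00))    # True: establishes the reference
print(cb.on_trade(1.0, 11.50))    # False: a 15% jump trips the cooling-off period
```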
How do regional variations affect you?
There’s little conformity among the Asian exchanges now, and I don’t think they are going to make an effort towards circuit breaker uniformity. This is fine with me.
What should be the key considerations for the exchange?
One key function of the exchange should be maintaining orderly markets. So it’s up to the exchange, and the regulator if they want to chip in, to come up with thresholds for what they consider rational moves in single stock trading. Flash crashes, and the subsequent cancellation of trades in stocks that get crushed or rise too sharply, don’t make any exchange look professional.
I think proper controls should be set at the broker level, but it’s more important for the exchanges themselves to also have certain rules that are adhered to by all brokers, instead of leaving it to every broker’s own interpretation of what would cause a disorderly market.
Risk Technology Moves Beyond Market Access Rule
By Brian Ross, CEO, FIX Flyer
Pre-trade risk management is not new to the sell-side or exchanges, but the way it is implemented continues to evolve. Since Market Access Rule 15c3-5 was enacted, the solutions implemented have operated at the pre-trade level. However, the complexities of today’s market, and compliance with the spirit of the rule, call for improved technology for both pre-trade and post-trade surveillance.
The Flash Boys book by Michael Lewis and the subsequent debate highlight the fact that investors have lost faith in the integrity of the markets within and beyond the borders of the US. But investors have not given up completely. They want the trading community, made up of themselves, brokers, exchanges, dark pools, and regulators, to offer solutions for keeping them safe.
Institutional investors are concerned with market participants who leverage predatory trading tactics resulting in skewed pricing and returns. Being able to monitor and stop this kind of abusive behavior is important. But managing risk is more than just setting limits and waiting for an alert to be triggered, like a speed bump that lacks awareness of previous patterns of information. An effective trade surveillance platform detects deviations from established trading patterns and possible manipulative trading.
Compliance officers and trade support desks must be empowered with tools to effectively monitor for suspicious activity in near real-time. In today’s high-volume trading environment, trade surveillance tools must scale extremely well in order to be effective under unusual, suspicious trading conditions.
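As a rough sketch of “detecting deviations from established trading patterns”, and not any particular vendor’s engine, the check below keeps a per-trader baseline of order rates and flags a burst that sits several standard deviations above it; the baseline length and threshold are assumptions made for illustration.

```python
# Hypothetical surveillance check: flag a trader whose per-minute order count
# deviates sharply from their own historical baseline (a simple z-score test).

import statistics
from collections import defaultdict, deque

class OrderRateMonitor:
    def __init__(self, baseline_minutes=60, z_threshold=4.0):
        self.history = defaultdict(lambda: deque(maxlen=baseline_minutes))
        self.z_threshold = z_threshold

    def on_minute(self, trader: str, order_count: int) -> bool:
        """Record one minute's order count; return True if it looks anomalous."""
        hist = self.history[trader]
        alert = False
        if len(hist) >= 30:                      # need a reasonable baseline first
            mean = statistics.mean(hist)
            stdev = statistics.pstdev(hist) or 1.0
            alert = (order_count - mean) / stdev > self.z_threshold
        hist.append(order_count)
        return alert

monitor = OrderRateMonitor()
for minute in range(60):
    monitor.on_minute("TRADER_1", 20)            # normal behaviour builds the baseline
print(monitor.on_minute("TRADER_1", 500))        # sudden burst -> True
```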
Investors expect that brokers, exchanges and regulators are working together to create a fair market with equal access. And the brokers, exchanges and dark pools need to prove to all investors that they are doing everything they can to provide a comprehensive risk management program. And all of this must be done while the buy-side develops ever more sophisticated technology to capture alpha.
FIX facilitates a holistic view of risk
Up until now, pre-trade risk management has been streamlined for equities. We are moving beyond this one-dimensional approach. Today a broker needs to aggregate exposure across multiple asset classes – equities, options, futures, fixed income, commodities, and currencies. Derivative markets face the same fundamental issues as equity markets, such as risk and orderliness, and the technology can be extended with minimal investment. In the US and other developed markets, regulation and oversight of derivative markets is less mature than that of equities markets, making pre-trade risk management and trade surveillance systems a critical concern for brokers, exchanges and regulators. For example, as derivative products continue to shift onto exchange-like venues, such as certain swaps moving to Swap Execution Facilities (SEFs), pre-trade and post-trade risk management must be put in place.
FIX makes it possible to get a single view across clients and asset types. Although the characteristics of the various asset types can be very different, the workflow for orders across them is very much the same. FIX is the lingua franca for trading across asset classes, simplifying risk management and facilitating a holistic view of trading activity. A FIX-based platform can provide a centralised view of all of a counterparty’s trading and positions in real time. This protects the broker from a client exceeding its financing intraday, and the client from an excessive margin call. The broker is also able to use this consolidated view to manage other risk and compliance related items, such as correspondent clearing.
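A minimal sketch of that consolidated view, assuming execution reports have already been parsed into dictionaries of FIX tags; the account names, limit and security types below are invented for illustration.

```python
# Hypothetical sketch: build a per-client, cross-asset exposure view from FIX
# execution reports (MsgType 35=8), using a handful of standard tags.
# Tags used: 1=Account, 54=Side, 32=LastQty, 31=LastPx, 167=SecurityType.

from collections import defaultdict

exposure = defaultdict(float)      # (account, security type) -> signed notional

def on_execution_report(msg: dict) -> None:
    if msg.get(35) != "8":
        return                                    # only process execution reports
    sign = 1.0 if msg[54] == "1" else -1.0        # 1 = Buy, 2 = Sell
    notional = float(msg[32]) * float(msg[31])    # LastQty * LastPx
    exposure[(msg[1], msg.get(167, "CS"))] += sign * notional

def breaches(account: str, limit: float) -> bool:
    gross = sum(abs(v) for (acct, _), v in exposure.items() if acct == account)
    return gross > limit

on_execution_report({35: "8", 1: "CLIENT_A", 54: "1", 32: "1000", 31: "50.0", 167: "CS"})
on_execution_report({35: "8", 1: "CLIENT_A", 54: "2", 32: "200", 31: "99.5", 167: "FUT"})
print(dict(exposure))
print(breaches("CLIENT_A", 60_000))   # True: gross notional of ~69,900 over a 60k limit
```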
FIX is effective at allowing different payloads within the same workflow, dealing with all of the idiosyncrasies at many firms, and allowing for nuances in the data. While that variability can sometimes complicate the normalisation of data, imagine how much worse it would be without FIX.
A big moment in understanding pre-trade risk came when the FPL Americas Risk Management Working Group released the Recommended Risk Control Guidelines in June 2012. The document targeted brokerage firms in the US that were looking for best practices and a common nomenclature. Since then, we have successfully introduced it into emerging and frontier markets, and it has accelerated the dialogue around risk for both brokers and exchanges. In addition, it has allowed vendors like FIX Flyer to educate and engage a new community that wants to be compatible with global standards. This is an excellent example of how the FIX Trading Community has gone beyond data definitions and protocol specifications into real business workflow.
Integration is easier with a properly defined FIX interface. FIX is common all over the world, and implementing a pre-trade risk workflow is straightforward when the local OMS can accept that workflow in FIX. For example, we have integrated the same core logic in Mexico and Turkey; even though each market is different, each with its own regulatory rules, we can provide a workflow, data dictionary and an integration that works. New pre-trade risk DMA gateways, along with critical trade surveillance technology, let local brokers provide a great experience to users and let emerging and frontier markets plug into the world’s capital market systems.
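The sketch below shows the general shape of such a pre-trade check layer; the limits, collar width and restricted list are invented values. Real gateways apply many more controls, but the pattern of rejecting an order before it reaches the market is the same.

```python
# Hypothetical pre-trade risk checks of the kind a DMA gateway applies before
# forwarding an order to the exchange. All limit values are illustrative.

MAX_ORDER_NOTIONAL = 1_000_000        # per-order notional cap
PRICE_COLLAR_PCT = 0.05               # reject limit prices >5% away from the reference
RESTRICTED_SYMBOLS = {"XYZ"}

def pre_trade_check(order: dict, reference_price: float) -> tuple:
    """Return (accepted, reason). order = {'symbol', 'qty', 'price'}."""
    if order["symbol"] in RESTRICTED_SYMBOLS:
        return False, "restricted symbol"
    if order["qty"] * order["price"] > MAX_ORDER_NOTIONAL:
        return False, "order notional exceeds limit"
    if abs(order["price"] - reference_price) / reference_price > PRICE_COLLAR_PCT:
        return False, "price outside collar (possible fat finger)"
    return True, "accepted"

print(pre_trade_check({"symbol": "ABC", "qty": 1_000, "price": 25.10}, reference_price=25.00))
print(pre_trade_check({"symbol": "ABC", "qty": 1_000, "price": 35.00}, reference_price=25.00))
```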
It takes many to create a trusted global market.
All participants are responsible for risk management and market surveillance because everyone has a stake and everyone shares the same goal.
Trust in markets comes from transparency and mutual alignment of individual interests. For example, an institutional buy-side trader trusts that markets are fair and brokers act in their interests, or they will take their business to someone who does. When discussing pre-trade risk with a large broker, we have been told “If all of my clients were Fidelity and Wellington, then we would not need risk controls. When we attract business from other firms, we need to keep it safe for everyone.”
If participants don’t help ensure markets are effective, the result will be more heavy-handed oversight and regulation. We are in a time where there is a lot of focus on HFT. Failures in electronic trading can be attributed to poor programming; a lack of rigorous testing; or people deliberately programming their systems to gain an unfair advantage. Those in the latter camp are just using the markets as a tool – if that tool is unavailable then they’ll find another. The fact is that the tool is not the problem, but the people are. A single counter-party can wreak havoc with a system malfunction or questionable trading behaviors, causing investors to lose trust in the market.
As we described before, the global capital markets have become complex and interconnected as technology has reduced the barriers to trade in local markets and across global borders. Exchanges and governments in frontier and emerging markets are eager to attract capital. But they are looking to the developed countries to use the same general structure and understanding of risk controls. Thankfully, new technology is being applied to shore up the integrity and performance of the global capital markets.
Exchanges and Data: Maximising Value
With Eva Saidac, Product Management, NASDAQ OMX
How are big data management and data trends affecting exchanges?
We see that the perception of data is changing substantially across all industries at a very rapid pace. There has been exponential growth in data over the last few years, with the proportion of unstructured data growing the most. That’s why firms really want to tap into those wider sources of data; it could be news, it could be video, it could be social media. Simply put, it could be any type of data that you can’t easily structure into a database, but which is very interesting and valuable.
Exchanges specifically have been dealing with high data volumes and high data update frequencies for a very long time. That’s part of the core business of an exchange; even if it is not a particularly large exchange, the volume of data builds up very quickly. Add all the historical data on top of that, and you get huge volumes as well as new complexities with the growth of low latency data.
The exchange industry is naturally managing and making new headway with structured data, but unstructured data can be very interesting from a strategic business perspective. However, technology challenges remain. Investments are needed in order to capture and manage all relevant data, including unstructured data, and that obviously needs to be supported by relevant business cases. The truly interesting thing is that when you put all the data, regardless of structure or source, real-time or historical, in the same context, you can start to discover things that you couldn’t before. From an internal perspective, business insight and intelligence will support decision making, but it can also be an effective tool to provide superior client support as well as a means to provide value-added offerings. It is a great opportunity for an exchange to utilise all the data that it already has in the best possible way, but also to tap into new opportunities so it can serve its client base with new data-related offerings.
Increased focus on data driven by exchanges, regulation and client demand
Obviously one can’t directly compare markets like the US, Europe or Asia too closely, but the factors driving capital markets’ increased focus on data include regulation, which increasingly demands that large volumes of data be made available at any given time. Market participants need to find appropriate solutions and providers to assist with these obligations, and exchanges could be the optimal supplier of such services.
Exchanges want to leverage their assets, provide better services and, which is always going to be the best driver, create and provide new client services that can generate revenue streams of their own. Exchanges could further leverage the fact that they are the primary source of much data – including the data sets from listed companies, the instruments’ reference data and corporate actions, as well as the vast amount of market data itself. I think that data receivers value the opportunity to obtain the most critical set of information directly from the source, and this can be used by the exchanges to broaden their data-related offerings.
Capital market firms overall have to manage the data that stems from their regulatory obligations; the buy-side uses more data and data elements in its research and investment strategies; and the sell-side supplements its trading strategies with more data sets, as well as using data to simulate and test them. An additional example is commodity trading, which is highly dependent on non-exchange-generated data, much of it unstructured, such as production and consumption projections, weather forecasts and news. These are examples of needs that exchanges can find new ways to serve.
How are exchanges utilising structured versus unstructured data?
The focus in the exchange industry has traditionally been, and of course still very much is, on structured data, and there are new opportunities in that space alone. One area where we see the use of unstructured data evolving is the market surveillance function. For instance, in the event of a market alert, surveillance staff can research the event further by accessing and searching data that was not easily searchable before, such as news, emails and social media, to discover connections between the data. This could be highly relevant when automatically scanning for suspicious trading patterns.
To summarise, what do you think will happen next within the data space for exchanges?
The big data wave is on its way and the exchange industry is no exception, with clear strategic demand for many different aspects of data. I think if an exchange doesn’t fully leverage the opportunity, someone else will step in and do it. In general, exchanges want to provide more services, and I think the reason data is such an important topic right now is the depth and breadth of its potential.
Everything within an exchange is about data; it is a core fundamental of the marketplace, and there is now a great opportunity to maximise its value further.
FIX Trading Community’s 12th Annual APAC Trading Summit
“Picking Up The Pieces”
- Burden on the buy-side
- Regulatory shift continues to move towards the buy-side, as does the burden for technological understanding and in some cases, development
- The buy-side trader’s required skillset continues to change, not only to continue sourcing blocks, but also to understand shifting market structure and technology
- Sell-side “doing more with less”
- Changes to how sell-sides are paid – increase in CSAs; increasing demand for high-touch service from low-touch desks; “blended” desks fit some, but not all, and also introduce commission/payment complications
- Regional desks on the sell-side: shifting tech teams and traders to cover more markets from centralized locations. Not necessarily headcount reduction anymore, but being smarter with the headcount placement.
- Exchanges and alternatives continue to innovate, and often create many questions (and opportunities) while doing so:
- Hong Kong-Shanghai Connect program has generated a vast amount of interest across the region; many questions remain about the precise workings of the program, and much remains to be done for many firms to be confident of being ready in time.
- IEX and “the book” generating interest, and creating many questions around the precise nature of market structure that is desirable. While the book itself was not much of a revelation to the industry, it has spurred a much wider conversation which will continue especially in the US and Europe. Common view that there’s a bit of sensationalism at work, and education is key.
- Collaboration between industry participants a key message in virtually every session. New innovations, regulation, and market microstructure changes are too much for any one participant to handle and optimize. Constructive – and often data-backed – dialogue and consultations are increasingly important for the industry.
- “Flash Boys” mention count: 13
Session Notes
- Shanghai-Hong Kong Stock Connect
- Part of the trend of opening up China and cementing Hong Kong as the best place to access China
- Calculation of quotas will be done on a real-time basis, but questions remain over how often that data will be released – the exchanges want to avoid rushes to grab the last of the quotas
- Clarification was given that QFII funds and those bought through the quota will be held in separate accounts, and as such are not fungible. Further clarification was given that while covered short selling is technically allowed on both markets, there are still difficulties being worked out regarding how to actually borrow shares in the Mainland.
- Questions remain around settlement, capital gains tax, voting rights, and latency arbitrage opportunities, among others.
- The Dark, the Lit, and the Alternatives
- Video interview with Brad Katsuyama, President and CEO of IEX Group.
- IEX is new, but growing. Looking to address major questions regarding US market structure through their 350 microsecond delay and other innovations and measures.
- Buy-side owned, as opposed to sell-side owned, which gives them a new perspective on market operation and value to the buy-side
- Broker priority on order types is one way of encouraging more crossing on IEX; this can serve as a free facility vs. a broker’s building and maintaining its own dark pools.
- “Good vs. bad HFT” – different types of arbitrage, some to take advantage of market inefficiencies, others technology inefficiencies – which are good and which are bad?
- Examples of negative reversion have sprung up – rather than creating market impact, after the trade, the security reverts in your favor
- Value Propositions and Exchange Models
- The wider market structure debate’s relevance to Asia needs examination. Only two really fragmented markets in Asia, namely Australia and Japan.
- The markets in the region are also relatively slow – latency arbitrage opportunities and other “negative” HFT behavior less likely to occur given fewer “playing grounds”.
- Using the dark – major problem over branding. Whoever first called it “dark liquidity” did nobody any favors. However, the buy-side has plenty of legitimate uses for the venues to trade anonymously and in size.
- Exchange test environment: Best Practices
- Key questions around testing environment design remain – different types of environments and simulations for different purposes. What do firms already have in-house vs. what they need the exchange to provide – how are the two different and what values do they both have?
- Should exchanges provide artificial noise on these environments? Should they just let test algos “bounce off each other”, or should they be full sandboxes? What capabilities are there for market events / shocks to be tested? Privacy of strategy testing? Repeatability of events like closing auctions? Many different ways to skin the cat.
- Resources are fundamentally limited for development
- No point testing just for the sake of testing – the environments need to be “stressful” enough to ensure that systems and products are ready to go live
- Industry-wide, standardized test cases could potentially be used for satisfying regulatory requirements, but “one size doesn’t fit all”. But standardized situations could be used for profiling and collecting meaningful metrics, e.g. order/trade ratios, cancellation rates, etc.
- Consensus from brokers and industry participants is key, and an action point for the industry
- Ch-Ch-Ch-Ch-Changes: Exchanges and regulation in Asia
- Updates from Greg Yanco of ASIC included their new surveillance system, and how it is changing how regulators conduct themselves – an increasing blend of industry conversation and hard data to back up proposals. Potential tick size competition in Australia remains an open question
- The balance of competition and free markets against too much fragmentation. Clarity is needed around dark pools and who can see what inside them.
- How to pay for regulators? Message fees – potential impact on order/trade ratios? Not disruptive on the whole, but market makers have raised concerns.
- Japan update – tick sizes and a pilot scheme to shrink tick sizes further, curiously at a time when the US is looking at a pilot to increase tick sizes. Increasing accessibility for retail is an underlying goal. Significant bid/ask spread reduction was mentioned as a result of the pilot program.
- Alternatives, e.g. Chi-X, can inspire innovation and changes (e.g. tick sizes) in certain markets; both institutions and retail can potentially benefit from price improvement. But systems must also be upgraded to handle innovation across the spectrum of market participants.
- Singapore update – market structure changes are incoming, but it is similarly a balance between analytics and consultation with the market. Conversations are ongoing around collateral, derivatives trading, and a broad market structure study. Point made that when volumes are low in a market, fragmentation may do more harm than good.
Trading Today’s Markets
With Robert Karofsky, Global Head Equity Trading, AllianceBernstein
There’s never been a better time for people to invest and transact in the market; the market is as efficient as it’s ever been and trading costs are at an all-time low.
We use dark pools significantly. I think dark volume as a percentage of overall volume is close to 35%, and we’re not exceptional in that. There’s less leakage in dark pools as we’re not exposing our orders to abuse. It doesn’t mean that dark pools are evil, it just means that there’s information leakage in the lit and we need to use a combination of all these tools to trade. The best and most sophisticated buy-side trading shops, of which we believe we are one, set nothing in stone.
You start trading and you learn throughout the day. If you don’t find natural liquidity, you trade quietly. Maybe you trade anonymously, you’re trading electronically, you’re probing, you’re exposing your orders to dark venues and sometimes you just get out of the market altogether. The best way to trade without impact is to be unpredictable and anonymous and that’s what we try to do. We try to be unpredictable, we try not to leave our footprints and so that means doing things differently every day.
Do you think more firms are starting to realise just how important the trading is as an integral part of the overall investment process, or do you think there will always be firms adopting a more fundamental approach?
I think that it depends. Some PMs think that they own the alpha: they press the button and they decide where they want to get in and get out. We know it couldn’t be further from the truth. I think the one benefit of all the noise around high frequency trading and Michael Lewis’s book is that it raises the subject of trading and lets people know how complex it is.
Connecting and navigating all this, with terminology measured in microseconds and microwave technology, is daunting; it’s not easy, and trading is extraordinarily important. You can look at it two ways: adding alpha or protecting alpha. For me there’s no difference; greater collaboration and coordination between the trading desks and the PMs and analysts pays enormous dividends. There are a lot of firms that look at it this way and I think those are the most successful firms.
If you don’t know where your orders are being executed and if you’re not monitoring accelerated venues and sell-sides, then you’re behind the times and you are obviously making yourself vulnerable to the predators in the market place. That’s why we have invested in our own quantitative trading team and they are constantly evaluating and monitoring these variables and making us highly effective in the market place.
There are problems in the marketplace though, and I think instead of saying the market’s rigged, it is better to say that there are problems with the market; you have multiple, different economic schemes at different exchanges, and you’ve got multiple order types at various exchanges, which create an environment where some people can have an advantage over others. It’s not always the case but it’s certainly a less than perfect marketplace; there are areas to improve and we should all be striving to reach that. This is not to say the market is rigged; there have always been people that have had advantages versus other folks and that’s never changed.
High frequency trading and electronic market making is just an extension or technological evolution of upstairs market making and specialists. We were at a much greater disadvantage under that regime; trading on a stock exchange with specialists who could see the order book at all times. We had one place to trade and they saw the whole picture and nobody else did, and they had their purpose. They had a lot more information than other people and they were a for-profit organisation. So why can’t electronic market makers also make a profit? The way they make profits is by very quickly sizing up supply and demand and taking advantage of that, and buying low and selling high; that’s the way it’s always been done.
It happens, but it’s always been happening. If I am working an order with Merrill Lynch and Morgan Stanley calls me and says they’re a buyer of the same stock that I’m trying to buy with another firm, I call that firm and get more aggressive. So that happens within 30 seconds but what is the difference? People are using information to their advantage.
It’s not a perfect system but the question is, what makes a perfect system? If you get rid of high frequency trading by doing certain things, then we go from 6 billion shares traded a day to maybe 3 billion. Is that good for people? You can ask, is volume liquidity, or is liquidity volume? I don’t think anyone really knows the answer to that, but it’s somewhere in between, so we need to be careful. Saying something is evil and needs to be carved out of the system is too much. There are some subtle things that can help level the playing field for all participants, but we don’t want to do something that makes people leave. We have got to be very careful. To reiterate, trading costs are lower than they’ve ever been – isn’t that the most important thing?
How much is regulation being reactionary to populist sentiment?
There are things that can always be improved, but you have got to be careful. Look at the use of steroids in baseball; the regulator called this into question and now the integrity of the entire sport is in a mess. The attack levied by the governing body almost backfired – they could have just tried to control the situation more subtly. If a system basically works, we need to accept that there are always going to be people that use the current environment to their advantage at the expense of others. This is never going to change.
The irony of the current situation is that everybody, including myself, had to Google what’s faster, a millisecond or a microsecond. Who does this really impact? I do agree that there are predators in the market, and some of the high frequency firms are predators. They arbitrage, they exploit the difference in data feeds between the direct exchange feed and the consolidated tape, and those things aren’t good, but I do believe that it is often just one high frequency firm versus another. If you look at profits at high frequency trading firms, they have significantly decreased over the years. It’s a fraction of what it was.
We have built-in systems that tell us when we’re getting gamed, when fill rates aren’t working; when people are running in front of us, we know this. If high frequency firms are speeding up, one positive consequence of that is we’re getting our signals faster as well. It doesn’t work the other way. If somebody’s scalping you for a penny, it doesn’t matter if they’re scalping you for a penny quickly or slowly.
HKEx Ecosystem Forum: Technology And Market Integrity
The 20th of March saw HKEx host their annual Ecosystem Forum conference. This event brings together the exchange and the industry to talk about the latest developments in technology and services in Hong Kong.
The conference altered its normal format slightly this year: alongside a range of presentations and speeches from HKEx senior staff, there were also two panel discussions, which focused on regulation and on changing electronic trading patterns in Hong Kong and the wider region.
The event opened with remarks by Jonathan Leung, Senior Vice President and Head of Hosting Services at HKEx, with an update on the latest technology being employed at the exchange. The day was mostly focused on the continuing rollout of the Orion suite of technologies being deployed by the exchange.
The Orion Central Gateway is one of these flagship technologies, which was elaborated upon throughout the day, including its support for FIX and precisely how it will function and be rolled out. The other central technology was the Orion Market Data initiative, and its move into derivatives data. These updates are coupled with the new push for the Orion Trading Platform, which HKEx hopes will update the trading platforms used on the exchange with greater functionality and the latest technology.
The first panel of the day focused on changing regulation throughout Hong Kong and Asia, taking a broad view of the drivers of regulation, be it retail, specific market incidents, or simply the stability and integrity of the market as a whole. One result, though, is a changing answer to the question of “who is responsible when something goes wrong?” As we have seen from the SFC’s algorithmic trading regulation that came into force on January 1st, responsibility is being shifted much more definitely towards the buy-side and electronic trading vendors, but there was still a call for the exchange to shoulder some of the burden as well. The precise balance of which checks sit where is a matter for continued debate, but generally the exchange’s later comments regarding circuit breakers and reinstating a closing auction in Hong Kong were well received as being the right steps in this direction.
On the Orion Market Data platforms specifically the drive is towards the customisation of the exchange’s products to specific segments of the market, breaking out products to meet specific needs of users. Latency is also being improved through faster feeds and a new messaging protocol.
One of the most widely attended sessions of the day looked at market integrity. HKEx said in the session that they were closely examining the reintroduction of the closing auction in Hong Kong, covering the various reasons why a closing auction should be brought back and the potential form that the auction might take. By way of international comparison, Hong Kong is the only exchange in the 23-member MSCI developed markets list without a closing auction. The exchange is expected to begin consultation in 2014.
IEX: Redefining The Modern Trading Venue
With Ronan Ryan, Chief Strategy Officer, IEX
How are your relationships across the Street developing?
The buy-side is an essential component of ongoing market change. A few years ago the buy-side started asking, “Where are you routing my order?” This has morphed into a demand for more granularity and more control over their orders. By virtue of the fact that the buy-side originated the order, they are owed an explanation of what happens next, and I think IEX is part of the next metamorphosis of this trend. As long as the buy-side continues to realise that they are the key to effecting change in the markets, I think it’s a win-win for everyone.
As we went to roll out the platform, the buy-side helped us get the brokers to connect. So from day one, we had virtually every broker connected to us, which is rare for a venue that’s not owned by the broker community. The buy-side is asking us how best to use IEX; they are the ones helping us with some suggestions. They are working with brokers to configure custom algos to better utilise IEX. So it has just been phenomenal how involved the buy-side has been in this process, and continues to be so.
And the relationship with the sell-side?
It’s been fantastic. We’re buy-side owned but we only have sell-side subscribers. We currently stand at 69 connected brokers with more in the pipeline. Of the dark pools in the US, we’re the only one not owned by brokers, so sell-side adoption of our venue is critical to us, as we didn’t create a Liquidnet whereby the buy-side comes directly to us. The sell-side at the beginning showed some reservations over us being yet another venue to connect to, but now that we’ve shown some legs and we’re printing some volume, the feedback from the brokers is great. In no way is it the intention of IEX to pit the sell-side against the buy-side. The only way for this to scale, and scale for the greater good of the market and of both the buy-side and the sell-side, is if we remain neutral.
The sell-side is also setting up meetings for us to meet with their buy-side clients and helping us better configure their systems to interact with IEX. So it’s become very consultative: they are considering our opinion and don’t feel we’re ramming a platform down their throat.
Is there a critical mass for an exchange, and are you there yet?
We’re at a point where we’re breaking even on several days, but I would say there is a “rising tide lifts all boats” effect. We have a critical mass in terms of the number of brokers and constituents connected to us, but it would be somewhat reckless to start sending vastly more volume to IEX. You need to dip one toe into the water, then another, and slowly build flow. We currently have the biggest brokers connected to us, but they’re not all going to throw a few hundred million shares our way. We recently ended up at around 0.6 of a percent of overall ADV, which is still big for a venue as young as IEX. What is happening is that the reallocation and the amount of orders that we’re getting – the size of the orders, how long each order rests and how many orders we’re getting from the brokers – is growing; not always day by day, but week by week for sure. So it’s a matter of proving ourselves and earning more volume: the ball is firmly in our court. The buy-side is working with the sell-side to create custom algos that maybe oversize IEX a little bit more than they would otherwise.
In our first six months, we’ve matched over 3 billion shares. Considering we are a venue with no broker ownership and no high frequency firms owning us, it’s pretty remarkable, but we haven’t let out a sigh of relief yet. If we end today at this volume, we’re making a little bit of money but we haven’t really effected any change. So we’re very happy with where we are, based on the trajectory of how fast we have got here, but we still have a long way to go.
To what extent are you having to redefine how an exchange measures itself against its peers?
We’re still a dark pool and there’s no standardisation as to how dark pools or exchanges report their statistics, which is a difficulty. So, for example, average trade size can be measured in several different ways; we have IEX statistics and we put out daily stats that are very consistent with what other venues put out because that’s what people expect to get from venues.
We also monitor the standard things; we routinely do a hundred thousand share prints, our biggest print to date being around 428,000 shares. The feedback from the buy-side when they’re on the side of those trades is very positive; they’re absolutely thrilled.
On the back end of things, we’re doing a lot of different toxicity studies and we’re getting vast amounts of day-to-day data. We’ve ordered a fleet of data servers and we’re starting to look at IEX analytics and how we can best present the flow on our venue. We have only been operating for six months so we don’t claim to have all the answers, but we’re doing a huge amount of analytics, we’re taking all the direct feeds and we save every trade; we’re trying to come up with a better way to measure, because at the moment it’s all about volume, and there are certain strategies out there that are just manufactured volume. A common question we are asked is: if IEX is supremely successful, aren’t we going to take volume out of the market, and will that impact liquidity? The answer is yes, if IEX went mainstream across all US equity markets, it would take volume out of the market, but it wouldn’t impact liquidity. A lot of volume out there is someone buying from one seller to sell to another buyer in a fraction of a millisecond, which is providing liquidity to a certain extent, but it is unnecessary provision, unnecessary intermediation.
So, while we are making an impact because we’re disruptive, we need to learn the industry ourselves in our capacity as venue operators before we go out and tell people what those new benchmarks are.
Is there a point at which something needs to change in terms of regulation, or is this something that’s best served by market solutions?
Clearly our belief is in market practitioners and market solutions. The market doesn’t need or want broad-stroke regulatory change. However, at IEX we are very supportive of the regulators, and they’re supportive of what we’re doing. What I would say is that the SEC has become far more data driven. So instead of basing opinions on white papers, they’re now taking in a lot of the data and doing a lot of that analytical work too. We’re very happy to see that the regulators are open to looking at the data, but they’re also open to market solutions.
On high frequency trading, we find ourselves defending high frequency trading more often than not because it’s another broad stroke definition. In our opinion there are good high frequency strategies, there are some predatory high frequency strategies, there are some that I would call unnecessary high frequency strategies, but there is no single target to shoot at. You have to build a market whereby those unnecessary or those predatory strategies are negated by technology. We are 100% pro-technology. We have many high frequency guys working at IEX. There are 34 of us; probably 75% of us are technologists and we took the approach: what would a predatory strategy need to survive? What would it need for an arb opportunity? How do you referee a market so that those things can’t happen?
We don’t do maker-taker; we don’t have a whole range of order types. But it’s impossible for any industry practitioner to make a call on who is “good”, “bad” or “unnecessary”. It is probably unreasonable to expect regulators to do it on their own.
Neither speed nor technology is the enemy. Even at IEX, here we are taking in and normalising the market data in 260 microseconds. We’re delaying something by 350 microseconds. That doesn’t mean that low latency technology is bad. It’s just a potential loss of utility. The introduction of technology has had a positive impact. Trading is cheaper. You can’t really argue that, but what we’re saying is that the utility could be greater if certain elements were cleaned up.
Alpha Innovation Required: The Inaugural Conference
Peter Waters, Managing Editor, GlobalTrading, attended a groundbreaking summit looking to address the evolving needs of the alpha-seeking buy-side community.
The concepts of big data and social media are generating a lot of interest in the world of financial services. Just how to make use of the vast quantities of data being generated every day, and how to harness the power of new technologies to create new trading ideas, is at the forefront of every buy-side firm’s agenda.
With that in mind, Franklin Templeton Investments put together an event between the 25th and 27th of February 2014 with a single focus in mind. The “Alpha Innovation Required” summit brought together more than 20 of the most cutting-edge vendors in new digital technologies, along with some of the world’s largest asset managers and top-tier brokers, to discuss just how to move forwards. Pulling back to view technology holistically and productively is just as much of a challenge as the eventual application of the technology.
The event was put together by Franklin Templeton’s Director of Global Trading Strategy, Bill Stephenson, who spent the preceding months reaching out to colleagues up and down the Street to gather the most exciting names and to receive countless demos. Spread over two principal days, the summit gave the innovators the first day to present their ideas and products. Each of the vendors, split into five separate categories, spoke for 10 minutes and took questions on why their idea was worthy of further attention. The second day was a chance to drill into details with specific vendors that each delegate chose to sit down with. The extra-long sessions provided much more detailed demos and analysis of the precise trends that led to the tools being developed.
What data?
With masses of data being created 24 hours a day, 7 days a week, just which sources of data a firm chooses is a slew of decisions in itself. Big data is generally categorized into two streams: structured and unstructured. As one speaker said, structured data is the easiest to use, but the problem is that everyone else has the data. Unstructured data is much more challenging, but infinitely more rewarding.
Twitter is one of the largest, and most obvious, sources of trading inspiration, but the sheer quantity of data, and the challenge of judging its quality, can seem an insurmountable task. Some of the presenters have developed different ways to manage the volume of data and to separate the valuable from the useless. The Icahn Apple tweet is valuable information, a retweet several days later is less so, and a tweet from a fake account can be deliberately misleading.
Another firm takes news from over 22,000 outlets, although not including Twitter, and feeds firms breaking news and summaries that tie disparate strands of information together into actionable content. The key is in choosing the right inputs, and constantly striving to ignore the fake signals. Through their system they have broken potential investment news hours before it reached the mainstream newswires.
Once the data has been gathered it needs to be cleansed and analyzed in order to have value, and these systems need to get constantly better at their jobs. A process called machine learning comes into play, by which the algorithms that pore through the data teach themselves what is most valuable. For example with Twitter, each tweet gets read and re-read by algorithms that are constantly learning and improving, so that the input can be ranked and made useful – what content other users respond to most, or key names and accounts associated with market-moving events. This process is one of the major driving forces of constant evolution in this space, as the tools themselves become better at their jobs by design, and feed clean outputs to other developers, who translate that information into a form that can be used by human eyes, or fed straight into algorithmic trading engines.
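As a toy illustration of the ranking idea, and not any vendor’s model, the sketch below scores tweets with a hand-weighted combination of account credibility, novelty and keyword relevance; the features, weights and accounts are invented, and a production system would learn these weights from labelled data rather than hard-code them.

```python
# Toy sketch of ranking tweets for trading relevance. The features, weights and
# accounts are invented; a real system would learn its weights from what content
# users and markets actually responded to, rather than use fixed rules.

KEYWORDS = {"acquisition", "guidance", "stake", "bankruptcy"}
CREDIBLE_ACCOUNTS = {"Carl_C_Icahn": 0.95, "random_bot_123": 0.05}

def score_tweet(account: str, text: str, is_retweet: bool, age_hours: float) -> float:
    credibility = CREDIBLE_ACCOUNTS.get(account, 0.3)
    relevance = sum(1 for w in text.lower().split() if w.strip(".,!") in KEYWORDS)
    novelty = max(0.0, 1.0 - age_hours / 24.0)     # stale or retweeted news is worth less
    if is_retweet:
        novelty *= 0.2
    return credibility * (1 + relevance) * novelty

tweets = [
    ("Carl_C_Icahn", "We have taken a large stake in Apple.", False, 0.1),
    ("random_bot_123", "We have taken a large stake in Apple.", True, 72.0),
]
ranked = sorted(tweets, key=lambda t: score_tweet(*t), reverse=True)
for t in ranked:
    print(round(score_tweet(*t), 3), t[0])
```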
Visualisation
If there is one tool which can be said to be at the heart of the entire financial services industry it is probably Excel. One theme of the event was how these latest innovations in big data management are trying to move beyond this reliance into more accessible and visual forms of interacting with the data.
With the exponential growth in research reports, trading information, data management, and a whole raft of other information that takes up both time and screen real estate, traders and PMs are always on the lookout for tools that can display information quickly and concisely. By simply using 3D modelling and selecting a color palette designed to be appealing to the human eye, huge amounts of data can be controlled much more instinctively.
The simple amalgamation of such data is also a vital part of the change. With huge volumes of research reports and analyst models moving across the buy-side desk, how can the PM layer their own perceptions on top of the data, and keep easy track of reports that may have arrived in an already crowded inbox weeks or months ago? Automated data management systems that keep track of keywords and linkages between records are now making this possible.
The upshot is that PMs and traders can enjoy immediate access to a wealth of data in a form that is far more flexible and can be tailored to each individual user.