
Buy-Side Technology Comes of Age

Jon Glennie, Global Head of Equities Trading Technology at J.P. Morgan Asset Management


A leading technologist from one of the world’s largest asset managers notes the impact cutting-edge technology is having on the buy-side trading desk

Can you briefly discuss your current role and responsibilities at J.P. Morgan Asset Management?

I am the Global Head of Equities Trading Technology. In this role I am responsible for the full end-to-end technology tooling that supports our four trading desks around the world as they manage orders, access market liquidity and analyze trading performance.

How has technology changed during your years in the business?

Historically, much of the investment and innovation in the broader financial services industry has gone into supporting sell-side trading desks as the markets they serve have become increasingly electronic. While technology innovation on the sell-side has not died away, the buy-side has started to realize the benefits to its operations, and ultimately its clients, of investing in technology and developing more innovative solutions to aid the performance of its trading desks.

From my perspective, the biggest mind-set shift in the industry over recent years has been the realization of how technology and great technologists can benefit the large institutional buy-side. Here at J.P. Morgan Asset Management, we have gone a step further by integrating technology and technologists into the trading function, with software engineers sitting side-by-side with traders in all of our trading locations. This tight-knit, one-team mentality helps the engineers better understand the trading function and innovate quickly to develop new solutions that optimize our trading platforms.

Ultimately, we operate as one trading function with the common objective of achieving the best trading outcomes for our clients.

What are the main innovation trends shaping institutional equity trading currently?

I see two common themes in buy-side trading technology – understanding how to leverage and harness increasing volumes of data to make more informed execution choices, and how to streamline the trading workflow. 

At J.P. Morgan Asset Management we have been driving innovation in both spaces.  For our low touch flow that is typically algorithmically traded, we have been increasingly leveraging machine learning techniques and applying them to our past execution data.  This process allows us to provide a recommendation on which broker and execution strategy to trade a particular order with.  Critically, with the sell-side evolving their execution algorithms all the time, our in-house built model continuously learns from traders and trading data inputs to refine this recommendation.
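
To make the mechanics concrete, here is a minimal sketch of how a broker/strategy recommender of this kind might be structured, assuming a tabular history of past executions; the file name, column names, features and gradient-boosting model are illustrative assumptions, not a description of J.P. Morgan Asset Management's proprietary system.

```python
# Illustrative sketch only: a toy broker/strategy recommender trained on past
# execution data. Column names, features and the model choice are assumptions
# for this example, not a description of any firm's proprietary system.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Hypothetical execution history: order features plus the broker/strategy
# combination that delivered the best realized performance for similar orders.
executions = pd.read_csv("past_executions.csv")
features = executions[["pct_adv", "spread_bps", "volatility", "momentum", "side"]]
labels = executions["best_broker_strategy"]  # e.g. "BrokerA/VWAP", "BrokerB/LiquiditySeek"

X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.2)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Recommend a broker/strategy pair for a new low-touch order. Traders can
# override the suggestion, and overrides feed the next retraining cycle.
new_order = pd.DataFrame([{"pct_adv": 0.03, "spread_bps": 4.1,
                           "volatility": 0.018, "momentum": -0.2, "side": 1}])
print(model.predict(new_order)[0])
```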

In certain markets, the smaller more routine orders are fully automated, which allows traders to add value to the larger and less routine flow hitting the desk.  For larger orders, which may follow multiple execution strategies, data and analytics are able to provide insight into where liquidity might be best sought.

Additionally, industry-wide I can see consolidation onto fewer trading technologies to help ensure a globally consistent and streamlined workflow in trading organizations.  For us at J.P. Morgan Asset Management we have moved all our flow to Spectrum Trading, our in-house-built unified OMS, Trading Automation and Analytics platform.  This shift has provided traders with the tools to manage, monitor and analyze the $1.3tn of order flow that we trade annually in a single integrated platform.

What are the associated challenges/opportunities for the buy-side?

As the buy-side embraces technology innovation more, there is an ongoing challenge in attracting the very best technologists to our industry. Historically, buy-side technology teams have implemented vendor platforms or worked with technologies that are now legacy, which may have hampered recruiting efforts in the past.

I moved to the buy-side relatively late in my career and, now three years into the journey, I genuinely believe some of the most innovative and transformational technology initiatives in financial services are now on the buy-side, revolutionizing how the buy-side trading desk operates.

What are the focus areas for buy-side resource allocation, in 2019 and looking ahead to 2020?

For the coming years I believe there will be a continued focus on how to harness technology to both optimize the trader experience and automate the more routine flow to drive efficiencies on the trading desk. Data and analytics will play a critical role in achieving this.

In order to move with agility across these large datasets, I can see firms increasingly looking to leverage public cloud technology to cheaply spin up data research environments. This does present its own challenges around securing the public cloud for proprietary datasets. At J.P. Morgan firm-wide, we are working with major public cloud companies to create secure environments in the cloud that meet our security standards while still yielding the benefits of cloud technology.

Additionally, I can see traders playing an increasing role in providing input into the investment process, highlighting more trading opportunities and market insights to the portfolio managers. By implementing smarter and more integrated technologies, traders will be able to disseminate market insights more efficiently to interested investors, which could result in an investor placing an order they would otherwise not have sent.

How are new data and innovation tools enabling buy-side trading desks to boost efficiency?

Leveraging data and increased technology innovation has become essential for any buy-side desk to remain competitive. Clients are increasingly interested in the costs of implementing investment decisions, so having the data to evidence best execution is, first and foremost, a must. Being able to leverage this data intelligently and provide recommendations on how to trade most effectively will progressively become a feature of the buy-side desk.

In making the desk more efficient, we will also enable the frictionless launch of new products without the need to scale up the trading desk to handle increased flow.

What are the key areas of focus in equities trading technology at J.P. Morgan Asset Management, including recent accomplishments and current/future initiatives?

Our major focus has been on taking a tech start-up mentality to how we build, test and deploy our technology solutions. This mindset is reflected in the fact that we now release our software at least once per day (typically around 450 releases per year), adding incremental benefit to the trading tools available to the traders and moving away from “big bang” releases.

Only through our heavily automated process, executing in excess of 29,000 tests prior to every release, have we been able to ensure the quality of the software released at this rapid rate. Not only does this frequent software release rhythm reduce risk, but it has also encouraged traders to provide ideas to further innovate across the platform.

One such example of this spirit of innovation is BETSI, our Symphony Bot that integrates the Symphony messaging platform into Spectrum Trading. Through this platform traders can communicate market and stock-specific news to investors around the globe, allowing us to streamline communications between the desk and the sell-side. Using automated chat-bots we can now simplify RFQs to the sell-side as well as identify matching IOIs from brokers, all within a single UI, enhancing the user experience while minimizing the risk of mistakes from rekeying data out of chat screens and emails.

The launch of a tool such as BETSI demonstrates the benefits of the tight-knit relationship between traders and technologists. This, coupled with the support of a platform that can release multiple times per day, can deliver innovative new solutions to the trading desk, from proof of concept to production, in a matter of weeks.

Regulatory Scrutiny Of Buy Side Will Not Abate

Deloitte highlighted that asset managers this year will face more regulatory scrutiny of their holdings of higher-risk assets and of their implementation of technological change, and should continue to modernize their regulatory, legal, and compliance functions.

The consultancy’s 2020 Investment Management Regulatory Outlook said: “Although the post-crisis wave of regulatory change is subsiding, there is much to attract regulatory and supervisory attention in 2020, and firms should not expect scrutiny to abate.”

Deloitte 2020 Investment Management Regulatory Outlook

As a result, Deloitte added that investment managers should continue to modernize and rationalize their regulatory, legal, and compliance functions and their practices.

“Investment management companies that take a holistic view of regulatory risk management may find efficiencies that lead to streamlined and rationalized programs,” said the study. “Some companies are even looking at their regulatory and compliance risk management programs as a competitive differentiator that allows them to be more nimble in the market place.”

In order to correctly identify risks, asset managers should use analytics to review the ever-increasing volume of data now being collected.

The report continued that asset managers will likely find it hard to perform well in the current environment of high asset prices and low growth potential. As a result, active managers are likely to invest more in exotic and less liquid markets to boost their returns.

“We expect supervisors to focus increasingly on how investment managers and distributors satisfy themselves that funds holding higher risk assets meet the needs and risk appetite of their target market,” added Deloitte.

Cybersecurity

Regulators are also concerned about ineffective implementation of technological change as there is growing evidence that it can increase cyber and operational risk.

“International standard-setters will likely try to establish baseline common approaches for operational resilience, but we expect progress on cyber-resilience to be made mostly at the G7 and European levels,” said Deloitte.

Cybersecurity issues also continue to grow. Deloitte said that its recent industry survey, in collaboration with FS-ISAC, found that financial services institutions spend 10.1% of their IT budget on average on cybersecurity. However, the number of data breaches in the first six months of last year increased by 54% over the same period in 2018.

Brexit

Brown Brothers Harriman noted in its On the Regs blog that the UK is due to leave the European Union at the end of this month. “Most asset managers have had contingency plans in place for several months now and have notified regulators of their plans,” added the blog.

Sean Tuffy, head of regulatory intelligence, custody & fund services at Citi, said in a report last month that there is an outstanding financial regulation issue related to Brexit that has not garnered much attention.

The What Does it Mean for FinReg? report explained that a new Financial Services Bill was meant to codify EU laws into the UK statute book but was delayed due to the election, so a new version of the bill needs to be passed before 31 January.

Sean Tuffy, Citi

Tuffy said that if a new bill is not passed and there is a no-deal Brexit, then the UK will not have equivalent rules and likely lose even limited access to the EU.

“It also would raise practical issues around cross-border regulatory reporting, such as derivatives transaction reporting,” he added. “So far both sides have shown a great willingness to avoid a hard Brexit but, the industry would sleep better knowing that a contingency plan is in place.”

BBH also noted that the UK’s Senior Managers and Certification Regime took effect for asset managers on December 9 last year. The regime holds firm leaders to a high professional standard, increases accountability on individuals and has extraterritorial applicability.

In addition, the BBH blog highlighted that the fifth EU anti-money laundering directive takes full effect this week as member states were given until January 10 to put enabling legislation in place and beneficial owners will have to register in each country’s registry.

Q&A: Tyler Moeller, Broadway Technology

Shhh. Can you keep a secret?

Banks are known for being secretive with their trading technology, opting to do a lot of the building themselves. What would a bank be without its “secret sauce” anyway?

But as the cost of technology becomes exorbitant and banks need to redeploy capital to other parts of the firm, some are learning to let go and farm out their technology needs.

Is it a budget thing? Or is third-party technology that good? According to Tyler Moeller of Broadway Technology, it has a lot to do with the fast-moving nature of markets and trading profitability. Moeller spoke with Traders Magazine Editor John D’Antona Jr. and discussed the state of technology development within and outside the banking community and what exactly is going on here.

TRADERS MAGAZINE: Why are banks finally turning to vendors to build for them rather than in-house?

Tyler Moeller: Disruption in the capital markets has exposed the limitations of building a complete technology architecture in-house. The costs are now too great, regulatory and market change is happening too quickly and the technological sophistication of clients and competitors is skyrocketing.

Tyler Moeller, Broadway Technology

Utilizing vendors has always made good business sense. The vibrant, competitive fintech market naturally pushes companies to develop innovative solutions and drive costs down. In contrast, building in-house can be a risky choice, lacking the accountability that a more open, competitive third-party marketplace provides.

Banks now realize all this and are rethinking the role that technology plays in gaining a competitive advantage. This introspection has produced an epiphany of sorts: core trading architecture is not where they should focus their efforts. Instead, the true edge can come from enhancing algos and analytics, and developing innovative services for customers.

Banks now understand that if they can outsource many of the foundational aspects of a trading architecture to a trusted technology partner – such as an enterprise platform for application and data integration, order routing and management, algo support, e-commerce functionality and connectivity – they can significantly reduce the cost and resources required  to support their trading architecture. These resources can then be dedicated towards the areas where their edge is best realized, including innovation and revenue generation, which will in turn produce better results and bring new ideas to market faster. 

TM: Is it a budget thing? Or is third-party technology that good? Or are the banks tapped out on resources?

Moeller: Budgets play an important part. The economics of building trading platforms in-house are no longer viable. It’s estimated that 75% of a bank’s IT budget now goes towards maintaining legacy software. These large-scale investments are unable to provide the returns they once did, given that FICC trading margins are slimmer.

But above all, banks are increasingly turning to tech firms because they know that they will struggle to compete with them in the long run. The pace of innovation is too great. Banks need the flexibility to move into new market areas, like complex package swaps trading or electronic repo trading, and the old infrastructures – whether they’re completely built in-house or some amalgamation of inadequate vendors and custom-built solutions – aren’t nimble enough.

The right technology partners will empower, not limit, banks. Both parties work collaboratively to streamline trading workflows in a manner that stokes revenue growth. To illustrate this – we integrate order management, order flow internalization and customer management workflows for our banking customers. What does this do? It gives traders and business leaders deeper insights on client flows, inventories and performance to improve client pricing and enhance the relationship. This increases efficiency, performance and profitability within the trading operation.

TM: Has adopting third party solutions gotten easier now than say, three or five years ago?

Moeller: With the right third party solutions, yes. The key to adoption is a strong open enterprise application integration and data platform that allows legacy, custom and third party systems to be easily integrated, deployed and brought online, and adapted over time.

Even today, many of the third-party solutions built for banks are designed for specific purposes and limited in their ability to integrate into trading workflows. These products work fine at the start, but can be difficult to adapt as the market shifts.

Strong enterprise platforms are playing an important role in enabling third-party technology adoption. The right platform allows banks and their partners to easily integrate, adapt and expand all aspects of their trading architecture. It also promotes the open data architecture necessary to extract valuable insights from the various, fragmented information sets that banks already possess.

Technology partners must also promote flexibility and collaboration as a platform isn’t enough if vendors insist on pushing rigid solutions onto banks. The right partner with the right technology will help banks realize the strengths of a single, globally-integrated trading infrastructure: flexibility, near-unlimited scale, accessible data and full interoperability across systems.

TM: Has the industry moved to a uniform or simple “plug and play” model of trading technology rather than self-made custom systems?

Moeller: It’s a mix of the two. The industry hasn’t moved to a complete ‘plug and play’ model because, in practice, it’s too inflexible to adapt to what a bank will need down the road. Instead, we’re seeing the advent of a hybrid ‘build and buy’ approach where banks outsource much of their trading architecture and their enterprise platform to a technology provider, while maintaining the ability to build their own applications on top, and focusing on those areas where they can create an edge over their competitors.

TM: What technologies have the banks opted to keep internal? Or are they “all in” on third party involvement?

Moeller: Banks will still keep some of their trading technology development in-house, especially in those areas where they’ll be able to differentiate themselves from competitors.

For example, super-regional banks in the Nordics won’t worry about having 24/7 coverage for US government bonds or Asian stocks. Instead, they’ll entrust trading platforms and partners to handle those workflows and focus their internal energy on face-to-face relationships with Nordic corporates, building analytics that enable them to better identify and anticipate client needs, and perhaps building algorithms that best suit their needs.

The combination of in-house expertise and flexible, configurable third-party technology will be advantageous. Banks will have the ability to develop custom capabilities without the cost and resources associated with in-house projects.

STA’s Toes Weighs in on Changes to Reg NMS

To Reg NMS or not to Reg NMS – that is the question.

Or how about a new Reg NMS?

The debate on this, taken on by the Securities and Exchange Commission, looks to address the myriad market structure changes that have developed since Reg NMS was adopted in June of 2005. By definition, Reg NMS is a set of rules passed by the SEC that attempted to improve the U.S. exchanges through fairer price execution as well as better display of quotes and the amount of, and access to, market data.

Jim Toes, STA

But in a market that measures execution in microseconds and mils, some are arguing the 15-year-old rule might need updating or scrapping altogether. One point of contention is the Order Protection Rule, which mandates that stocks must be traded on the exchanges showing the best-quoted prices. One of the criticisms is that the rule gives an advantage to high-speed traders. There is also a perception that the rule makes the market more expensive. Pension funds and other institutions can also find it more difficult to execute trades.

To that end, today the Securities and Exchange Commission is meeting to discuss whether to solicit public comment for a proposed order that “would require the SROs to propose a single, new NMS plan that would increase transparency and address inefficiencies, conflicts of interest and other issues presented by the current governance structure of the three NMS plans that govern the public dissemination of real-time, consolidated equity market data for NMS stocks.”

Jim Toes, President and Chief Executive Officer at the STA, has weighed in on the subject, publishing an article on the topic Tuesday and sharing it with Traders Magazine. What follows is a direct transcript of the article.

The SEC’s Meeting on Market Data

“Wednesday, Jan. 8, the Securities and Exchange Commission will consider whether to issue for public comment a proposed order that “would require the SROs to propose a single, new NMS plan that would increase transparency and address inefficiencies, conflicts of interest and other issues presented by the current governance structure of the three NMS plans that govern the public dissemination of real-time, consolidated equity market data for NMS stocks.” Details on the proposal are not yet known; however, there is already public debate as to whether another, albeit new, NMS plan is the best path towards resolving the highly contentious issues associated with the quality and cost of market data. While the Commissioners have not yet voted, there is speculation that some will oppose this strategy and express a preferred course of action that involves direct SEC rule-making.

This is not the first time the Commission has had to make a decision on which process to follow for implementing new rules. Unfortunately for the industry, our ability to provide comments on such a decision is limited; that is, we do not have enough information nor a set of criteria needed to make an educated recommendation. This is why we believe that opinions that strongly favor one approach over the other at this early stage are premature. It’s the equivalent to saying a hammer is better than a screwdriver for putting two pieces of wood together without knowing if you have a nail or a screw to complete the task.

Rulemaking and NMS Plans

Rules governing the securities markets today are introduced to the marketplace either by SEC initiatives in the form of direct SEC rule proposals, or by rule filings of the SROs, made independently or under the direction of the SEC and executed via an NMS plan. While there are similarities in these processes, they are distinct and present different challenges and efficiencies. Concerns about the conflicted nature of NMS plan governance – specifically, that the for-profit entities responsible for designing, implementing and maintaining the plans will act in their own best interests to the detriment of other market participants – are bona fide. On the other hand, while direct SEC rulemaking carries fewer conflicts of interest, the amount of data required for that process has overwhelmed industry participants to the point where the efficiency of this approach has its doubters as well. Results from these two processes over the past decade have been mixed, and are therefore inconclusive as to whether one is better than the other in any or all circumstances. Both have their flaws and benefits.

In June 2012, STA testified before a House Financial Services Subcommittee on the two processes available to the Commission for introducing rules into the marketplace. We acknowledged then, as we are doing today, that there are efficiencies within both processes which when applied properly can serve the competitive nature of our markets and investor confidence.

However, we also stated, “Our concerns reside in the lack of criteria that are used in deciding which process better serves investor confidence when rules are proposed.”

Even in situations where the Commission is in complete agreement on the goals of a piece of regulation, there is no clear criterion for the best process to introduce it. This void, which remains today, coupled with a lack of details on the Commission’s new proposal, makes it extremely difficult for the industry to provide meaningful input.

Neither process is perfect, but both can produce meaningful improvements if executed with transparency into the process and with industry input. We must be confident that both options were considered by the Commission and acknowledge that compromises were likely made to arrive at solutions that can garner sufficient votes to pass.

Managing or Eliminating Conflicts of Interest

Concerns about conflicts of interest arise repeatedly in financial markets. Often they are met with calls to eliminate or mitigate the source of conflict to reduce any potential harm. Generally speaking, STA has always taken a more balanced approach in the presence of conflicts of interest, choosing to find the means to manage them through better transparency and disclosures brought about by competitive or regulatory catalysts. In September 2017, when STA expressed our increasing concern with the Reg NMS Plan process and its impact on investors, including the timeliness and efficacy of such plans, we chose to recommend reforms as opposed to eliminating all or any single plan. We wrote:

“STA believes conflicts between SRO and non-SRO participants, who are essential for a successful NMS Plan outcome, play a role in the shortfalls of these Plans. We believe reforms are needed that address these conflicts, and that such reforms should be guided by improving investors’ interests.”

In our conversations with regulators, one opinion we often get challenged on is that investors can be better off dealing in a regime with conflicted parties when the conflict-free alternative is inferior. This is not to say that conflicts should be overlooked, let alone encouraged; rather, we recommend that regulators consider whether the regime created by a conflict-free process is superior or inferior to one where conflicts exist. Banning a process because conflicts — even meaningful ones — exist is not a sound practice.

Therefore, while we recognize the inherent conflicts in NMS plans and that the stakes involving market data are high, we are hesitant to dismiss an NMS plan solution at this point without knowing what the Commission is charging the SROs to deliver in its newest proposal.

A Good Step

In the absence of details on the Commission’s new proposal and the lack of criteria for making a decision on which course of action to take, the question becomes: what does our industry do today and in the immediate days following tomorrow’s (Wednesday’s) decision? I believe our best response is to respect whatever decision the Commission makes regarding process and then provide the much-needed input to best ensure it achieves its intended goals. Neither process is perfect, but both can produce meaningful improvements if executed with transparency into the process and with industry input. We must be confident that both options were considered by the Commission and acknowledge that compromises were likely made to arrive at solutions that can garner sufficient votes to pass. We need to be satisfied that some progress in the public dissemination of real-time, consolidated equity market data is better than no progress at all, and that we should not allow the perfect to be the enemy of the good.

We look forward to tomorrow’s decision and providing the Commission with industry input on how to move forward.”

The IEX D-Limit Proposal: It’s Good…But What If It’s TOO Good?

IEX recently proposed a new order type, the D-Limit order, that would apply its Crumbling Quote Indicator (CQI) to lit orders. The CQI uses a sophisticated empirical model that predicts when prices are about to change, effectively replicating the approach used by high frequency market makers and some broker algorithms to avoid being “picked off”. Specifically, as with their hidden D-peg order, the D-Limit order would be re-priced to a less aggressive price whenever the CQI signals a deteriorating quote.[1] Adjusting limit prices in the face of imminent adverse price changes helps to reduce adverse selection and enhance limit order performance. Indeed, IEX provides some compelling statistics to support the effectiveness of CQI, evidence which strongly supports the claim that the CQI can be a useful tool in managing adverse selection.
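
A toy sketch of the repricing behaviour just described may help: when the CQI fires on one side of the market, resting D-Limit orders priced at or through the about-to-deteriorate quote are slid to a less aggressive price. The one-tick adjustment and the eligibility test below are assumptions made purely for illustration; the actual increments and rules are set out in IEX's filing.

```python
# Toy illustration of the repricing behaviour described above: when the CQI
# fires on one side of the market, resting D-Limit orders priced at or through
# the deteriorating quote are slid to a less aggressive price. The one-tick
# adjustment and the eligibility test are assumptions made for this sketch;
# the actual increments and rules are specified in IEX's filing.
TICK = 0.01

def reprice_d_limit_orders(orders, cqi_side, crumbling_price):
    """orders: list of dicts like {"side": "buy", "limit": 10.05, "type": "D-Limit"}."""
    for order in orders:
        if order["type"] != "D-Limit" or order["side"] != cqi_side:
            continue
        if cqi_side == "buy" and order["limit"] >= crumbling_price:
            order["limit"] = round(crumbling_price - TICK, 2)  # less aggressive bid
        elif cqi_side == "sell" and order["limit"] <= crumbling_price:
            order["limit"] = round(crumbling_price + TICK, 2)  # less aggressive offer
    return orders

book = [{"side": "buy", "limit": 10.05, "type": "D-Limit"},
        {"side": "buy", "limit": 10.01, "type": "D-Limit"}]
print(reprice_d_limit_orders(book, cqi_side="buy", crumbling_price=10.05))
```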

On its face, the D-Limit could add significant value by providing the broader trading community a means to compete with the more sophisticated, high-frequency traders when posting limit orders. But, as discussed below, some unanswered questions exist surrounding how D-Limit varies across stocks, how it would perform in volatile markets – especially in times of market stress – and what impact a broad adoption of these types of tools across protected venues would have on the market as a whole.

Da Pros of D-Limit

The big draw of the D-limit is that it provides limit order traders access to similar tools used by market makers, helping to level the playing field. Indeed, one of the key drivers of limit order performance is management of adverse selection, so tools that help mitigate adverse selection should provide meaningful improvements to performance. But one additional aspect of D-Limit that makes it especially helpful is that the D-Limit is done at the exchange level natively, whereas HFTs and other low-latency traders must have their signals reside on their own servers. Further, these other traders must pass through the famous IEX speedbump, whereas the D-Limit responds without delay. The net effect is a sophisticated pricing engine with a speed advantage that is available to all limit order traders.

With this in mind, one can’t help but compare this mechanism to other proposed changes, such as the EDGA asymmetric speedbump. The IEX D-Limit is like an asymmetric speedbump since it benefits liquidity providers at the expense of liquidity takers. The key difference is that exploiting the benefits of EDGA requires traders to proactively reprice their own orders in a very short time horizon (milliseconds), a benefit that is generally not useful to many traders. The benefits of the IEX D-Limit are available to all traders, without the need for low-latency machinery. And as we noted in our discussion of the EDGA asymmetric speed bump, reducing adverse selection risk may actually lead to greater liquidity offered.[2]

But questions remain…

The biggest question to me is what impact this will have on IEX quotes of individual stocks. IEX notes that the CQI is “on” less than 1% of the day, while 24% of trades occur when it is on. IEX also notes that much of the trading that occurs when the CQI is on is done by just a few firms. While these stats suggest that CQI is a targeted means of improving limit order execution quality, they provide only an aggregate view of CQI. Use of such high-level, aggregated numbers is generally not overly useful in a marketplace with such broad variation across stocks. The U.S. equity market is characterized by highly liquid and highly illiquid stocks (and everything in between), stocks whose prices rarely move during the day alongside those whose prices change several times within the same second. Basing decisions on such aggregate numbers is like choosing which jacket to wear today by looking at the average temperature of Earth!

I suspect most market participants and the regulators will want to see more details to better understand how specific groups of stocks – or even individual stocks – would be impacted by this proposal. Nasdaq, for example, put out a remarkably detailed analysis of its Intelligent Ticks proposal with sufficient details that commenters like yours truly can sit in a corner and throw rocks. But jokes aside, having such detailed information helps inform the debate and is critical in helping market participants and regulators appreciate the impact of proposals before any decision is made.

The more detailed analysis should also include how D-Limit, and the IEX quote in general, would behave in periods of high volatility. Presumably, volatility is a key factor determining whether CQI is on or off. Clearly, then, putting D-limit in the context of different volatility regimes is key since the statistics surrounding how often CQI is on, for example, would clearly be a function of the volatility regime. Of course, one could argue that HFTs and others utilize these tools already, including in times of stress. But as the D-limit orders would be part of IEX’s quote, which in turn is a protected quote disseminated in the SIP feed, it is important to understand its behavior across securities and in different market environments.

And stepping back, the IEX proposal raises some potential broader concerns about market quality under broad adoption of these tools. The fact that IEX’s D-Limit is available to all and its CQI logic is applied uniformly to all (relevantly priced) D-Limit orders increases the correlation of liquidity provision across traders. Specifically, since IEX would update all the affected D-Limit orders in unison, the IEX quote itself becomes more “fluid” as market prices change. Consider the extreme case where D-Limit is so wildly successful that all IEX traders use D-Limits. In such a scenario, the entire IEX quote would move once the CQI was on. And if other markets were to adopt similar order types, which also become popular trading tools, a huge chunk of the consolidated quote may fade at the same time, potentially when market liquidity is needed most. While I may be getting ahead of myself with excitement about this order tool, such issues need to be considered since, if the SEC approves the D-Limit order on IEX, presumably they would have to also approve identical (or sufficiently similar) order types.

Conclusion

The D-Limit proposal is one of the most interesting and thought-provoking proposals to come along in some time, perhaps since the introduction of IEX’s own speed bump. While its introduction does raise some questions, the fact that IEX is pursuing tools aimed at enhancing liquidity and improving performance by leveling the playing field a bit is quite encouraging. Definitely a good way to start the New Year! [3]

Jeff Bacidore is the Founder and President of The Bacidore Group, LLC.

ING Makes AI Bond-Trading Tool Independent

ING is separating Katana, and new funding will expand the artificial intelligence tool, which was developed by the Dutch bank to help investors make quicker decisions on which bonds to buy and sell.

The Dutch bank originally developed Katana to help its traders provide liquidity and respond more quickly to requests for quotes from clients. In 2018 ING launched Katana Lens for the buyside after developing the offering with PGGM, one of the largest pension fund managers in the Netherlands. ING said other asset managers have also been using the tool and providing feedback and market validation.

Santiago Braje, Katana Labs

Santiago Braje, former head of credit trading at ING and chief executive of Katana, told Markets Media: “Katana will be independent of ING’s trading activity which will avoid conflicts of interest and we will be able to access data across the whole market.”

Katana Lens systematically scans a universe of investable bonds chosen by the fund manager and identifies possible opportunities based on the current yield or spread against historical behaviour for similar underlying risks over the past six to 12 months. Clients can see profitable opportunities they might have otherwise missed, make decisions more quickly and trade smaller tickets more frequently.
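
As a rough illustration of this kind of screen, the sketch below flags bonds whose current spread looks wide relative to their own trailing history; the 126-day window, the z-score threshold and the data layout are assumptions made for the example, not a description of Katana's actual methodology.

```python
# Rough sketch of a relative-value screen of the kind described above: flag
# bonds whose current spread looks wide versus their own trailing history.
# The 126-day window, the z-score threshold and the file layout are
# assumptions for this example, not Katana's methodology.
import pandas as pd

# Hypothetical input: one row per bond per day with an option-adjusted spread.
history = pd.read_csv("bond_spreads.csv", parse_dates=["date"])  # date, isin, oas_bps

def spread_opportunities(history, window=126, threshold=1.5):
    rows = []
    for isin, series in history.sort_values("date").groupby("isin"):
        trailing = series["oas_bps"].tail(window)
        mean, std = trailing.mean(), trailing.std()
        latest = series["oas_bps"].iloc[-1]
        if std > 0 and (latest - mean) / std > threshold:
            rows.append({"isin": isin, "oas_bps": latest, "zscore": (latest - mean) / std})
    return pd.DataFrame(rows)

print(spread_opportunities(history).head())
```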

Katana Labs Ltd has been incorporated in the UK as part of the spin-off, with ING Ventures investing a further £1.5m ($1.9m) alongside other investors as part of a £3m funding round.

Katana presents trading ideas but is not linked to execution.

“Our roadmap includes linking to execution, so the buy side has an integrated solution, and adding liquidity intelligence features,” said Braje.

He continued that the new funding will also be used to expand the firm’s team and its bond coverage. Katana currently focuses on emerging markets, global credit and European investment grade bonds.

Braje added: “In a year’s time we aim to have expanded product coverage, added liquidity intelligence and linked to execution. We should also be ready to start expanding globally.”

Fixed income automation

Meaningful automation is coming to fixed income according to the Top 10 Market Structure Trends for 2020 from Greenwich Associates.

The consultancy said absolute volume traded through electronic channels across request-for-quote, order book and streaming increased by over $2bn (€1.8bn) a day in US corporate bond markets last year and investors began widely adopting automated trading tools.

“Furthermore, electronification of processes on the trading desk other than the execution itself—salespeople deciding whom to call, more efficient capital allocation, etc.— will happen more quickly and have a greater impact on the market’s functioning going forward,” added Greenwich.

ING Labs

Katana is one of 25 different innovation initiatives currently in one of the ING Labs in Amsterdam, London and Singapore. It is the latest start-up originated by ING to enter the scale-up phase, following other projects such as Yolt, a mobile app that helps people keep track of their finances, and Cobase, which allows multinationals to access their accounts across banks and financial institutions.

Annerie Vreugdenhil, ING

Annerie Vreugdenhil, chief innovation officer at ING Wholesale Banking, said in a statement that Katana has grown from an internal innovation project to a serious value proposition for bond investors.

“We attracted major clients who see the added value of this super smart AI-tool,” she added. “I’m proud that with our support Katana grew out to a fully-grown fintech that is ready for an independent future. A future that I’m sure will be very successful.”

FX trading focus : Muamar Behnam : Swissquote

EXPECT THE UNEXPECTED.

Muamar Behnam, Head of Sales HQ at Swissquote Bank, shares his thoughts about the state of FX trading in 2019 and looks forward to the start of the new decade.

At the beginning of 2019, a survey by JP Morgan said that liquidity, or ‘the ability to buy and sell currencies whenever needed with minimal market impact’, was seen by currency traders as the biggest challenge for the year ahead. Was that, in your opinion accurate, and will that also be the case for 2020 and if so why?

Definitely. Overall 2019 has been an average to good year in terms of volume, especially considering the periods of quite low volatility we faced. We, and I imagine quite a few other market participants, faced some tricky moments during the year. January 2nd and the flash crash on the Swiss Franc (CHF) springs to mind, when prices went wild and few liquidity providers (LPs) were still in. Thankfully, we could rely on some very strong LPs during these events who provided us, and our clients, with the requested volumes. 2020 will be no different in my opinion. Even if volatility remains average to low, we will still face these moments where the market will be impacted by huge volumes being sent simultaneously, and prices will be hundreds of pips away from the requested ones.

What impact have geopolitical tensions had on the liquidity landscape?

If we accept that political uncertainties have reduced market volatility, and have thus impacted volumes in H1 2019, the effect of geopolitical tensions has, in my opinion, remained relatively limited. The trade war between the US and China went on all year, as did the endless Brexit discussions, and even the attack on the Saudi oil facilities had limited impact. These events, like others of the same kind, have an immediate impact on the market, but I’m not sure that any of our liquidity providers were unable to deliver during these times. Geopolitical tensions create trading volume but they don’t change the fundamentals of the liquidity landscape.

What are the technological changes that have affected behaviour and the full trade cycle?

Algo trading is making monetisation of the FX trading flows more and more difficult. As with many FX brokers, we sit in the middle of a game where on either side we deal with market participants who are becoming more and more technologically prepared to make the most of every tiny market move. Sharp algos and trading pattern detectors on the traders’ side, and smart instruments, monitoring tools, last look and speed bump features on the LPs’ side make it a more and more competitive game. It’s a given that we must always do our best, left and right, so that all participants are happy at the end, which makes it for me a fascinating game to watch. We have definitely entered the era of ‘Champions League’ FX trading!

What are the different liquidity needs of the various institutional players – asset managers, hedge funds, pension funds, etc?

The needs are as varied as the participants. Some focus on specific instruments where high volatility allows them to keep shorter-term positions; some are still ‘news’ traders, so we must be able to guarantee liquidity during those times. Others (usually larger ones) just want to make sure that we are always available to offer them the depth they need in case they have to cover their exposure. But we’ve also noticed a trend among the largest partners that positions are kept much longer than they were in the past, and all these clients use just a fraction of the available leverage. The demand for ‘Full Amount’ trading has definitely increased. Both makers and takers understand that trading larger tickets against one counterparty helps reduce market impact, as well as improving spreads, since a provider is not required to hedge as quickly.

What is the level of fragmentation in the market and how are market participants such as Swissquote Bank addressing this issue?

I believe it has never been as fragmented as it is right now. We have retail clients trading like institutional ones; the level of knowledge and the flow they send is sometimes more challenging than the flow we get from institutional partners. Technology has empowered everyone. Money managers have strategies that are incredibly well designed. The spectrum of clients keeps getting larger. The institutions are ever more demanding, and every demand is more specific, down to the finest detail, and we always try to come up with an answer to their requests. At Swissquote, we have fine-tuned our liquidity pools to serve every type of client in the best way possible. Our institutional partners have access to dedicated liquidity pools because that is the only way to match their demands; the “one-size-fits-all” approach is no longer viable. Customisation of our systems, our liquidity pools, our platforms, and our risk management systems is a must-have now, and we’ve worked in that direction these last two to three years.

What role is data playing and how can it be better harnessed?

It is extremely challenging to improve what you are not measuring – the industry cannot fly blind anymore. Analytics are key for providers to understand the quality of flow and where and/or how to route it. We are just at the beginning of exploiting the incredible resource that is the terabytes of data we collect every day. It allows us to enhance the service we provide by understanding how clients trade, what their needs are, when they trade, what they avoid, and what triggers trading in a low-volatility environment.

What are your strategic priorities for the year ahead?

We will continue to develop our offering by connecting to more electronic communications networks (ECNs), diversifying our institutional liquidity pools, and by bringing in new actors who have a different risk appetite. Our institutional liquidity pools have improved a lot these past two years, but we want to push to the next level in 2020. The ECNs are the marketplaces for institutions and we must be present, not only on all the major ones but also on the new and innovative ones. Our specialists are continuously screening the newcomers in this segment and looking for opportunities. But beyond that, 2020 will definitely be an exciting year, with the upcoming US elections and the aftermath of the UK General Election and Brexit. As much as you can prepare a strategy, defining this and that parameter, we must above all be ready to adapt to any kind of situation should the unexpected occur… again.

©BestExecution 2020


Market Impact in Open & Close Auctions

By Kathryn Zhao, Global Head of Electronic Trading, Cantor Fitzgerald; Liang Liu, Head of Electronic Trading Quant, Cantor Fitzgerald; Haixiao Cai, Senior Electronic Trading Quant, Cantor Fitzgerald; and Yan Yu, Electronic Trading Quant, Cantor Fitzgerald

The deep liquidity at equity markets’ open and close auctions typically provides price discovery and facilitation of block order executions.  Trading at these centralized, large-scale liquidity events permits institutional investors to establish sizeable positions without undue complexity [1]. To participate effectively in auctions, we need to understand how an order can potentially affect the auction price, namely the market impact. While market participants have a relatively good understanding of the market impact in continuous trading sessions, there is much less knowledge about auction market impact in the public domain.

Kathryn Zhao, Global Head of Electronic Trading, Cantor Fitzgerald

The conventional method to calibrate a market impact model is to run regression analysis on the price change before and after the order against the order size using proprietary order data accumulated over many years. However, applying this method to estimate auction market impact has two major drawbacks:

There is only a single open and a single close auction each day, so a large data set takes a very long time to accumulate. Moreover, the auction price used in such regressions reflects only the final equilibrium price and not the price dynamics to reach the equilibrium, which is often used in the calibration of a continuous market impact model. 

The regression using a given firm’s trading data is often noisy because the firm’s orders only represent a small percentage of the total number of orders entering the auction.

In order to address these shortcomings, the next natural source of data to utilize is the auction imbalance messages published every second by the various exchanges (after 09:28 for open auctions and after 15:50 for close auctions on Nasdaq & NYSE). Using these imbalance messages, we obtain aggregated interest for an auction from all market participants (NYSE imbalance messages include d-Quotes at 15:55) and have many more data points for each symbol and day. The time series of the imbalance quantity and the near indicative clearing price offer a rich data set to analyze the market impact for the auction. Our goal is to find out how much the auction price will be “pushed” by additional imbalance quantities on average and then use that as a proxy for the market impact for the auction.

Methodology

Each second the exchange publishes imbalance messages, which are “snapshots” of the auction book, including paired quantity, imbalance quantity, reference price and near indicative clearing price [2, 3]. Near indicative clearing price indicates a tentative auction price where the current auction book and the continuous book would clear against each other. As additional orders are added to the auction book, the near indicative clearing price will change, reflecting the impact of those newly added orders. The change of imbalance quantity captures the “net flow” of auction eligible orders. However, the imbalance quantity is the unexecuted quantity at the current reference price, while the reference price floats with the market NBBO. In other words, each imbalance message only gives us a glimpse of the order imbalance through the “open window” of the reference price. In order to gain a full understanding of the order imbalance, it is imperative to string those “snapshots” together. By leveraging those insights, we develop a linear impact model and a proprietary algorithm to estimate the market impact coefficients for open/close auctions.
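
As a simplified illustration of this approach, the sketch below regresses second-to-second changes in the near indicative clearing price on changes in the signed imbalance quantity for a single symbol-day. The file layout is an assumption, and the proprietary algorithm described above also handles the floating reference price and other subtleties that this toy regression ignores.

```python
# Simplified illustration of calibrating a linear auction impact coefficient
# from the per-second imbalance feed: assume the change in the near indicative
# clearing price is roughly proportional to the change in the signed imbalance
# quantity. The real procedure described in the article also handles the
# floating reference price and other effects that this toy regression ignores.
import numpy as np
import pandas as pd

# Hypothetical snapshots for one symbol-day: ts, imbalance_qty (signed), near_price
snapshots = pd.read_csv("imbalance_messages.csv")

d_price = snapshots["near_price"].diff().dropna()
d_imbalance = snapshots["imbalance_qty"].diff().dropna()

# Least squares through the origin: d_price ~ lam * d_imbalance.
lam = float(np.dot(d_imbalance, d_price) / np.dot(d_imbalance, d_imbalance))
print(f"estimated impact coefficient: {lam:.3e} price units per share of imbalance")

# Proxy for how far an extra q shares of imbalance would "push" the auction price.
q = 50_000
print(f"estimated price push from {q:,} additional shares: {lam * q:.4f}")
```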

Figure 1. Close Market Impact – Relationship between Coefficient and Auction Volume

The richness of imbalance messages increases with auction volume. As a result, the impact model of stocks with active auction trading is statistically more significant than that of stocks with less auction activities. We calibrate stock-specific market impact coefficients for all Russell 3000 stocks, and then we recalibrate them on a quarterly basis. We have found a strong correlation between a given stock’s average auction volume and its market impact coefficient. As shown in Figure 1, on a log-log scale, market impact coefficient and auction volume exhibit a clear negative linear relationship. Using this finding, we create a model that relates the market impact coefficient to the auction volume and then use it to estimate the market impact coefficient for stocks outside the Russell 3000.
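
To illustrate how such a relationship can be used, the sketch below fits a straight line to log(coefficient) versus log(average auction volume) over the calibrated universe and applies it to a stock outside that universe; the file and column names are assumptions for the example.

```python
# Sketch of the log-log fit shown in Figure 1: regress log(impact coefficient)
# on log(average auction volume) for the calibrated universe, then use the
# fitted line to estimate coefficients for stocks without enough auction data.
# File and column names are assumptions for this example.
import numpy as np
import pandas as pd

calibrated = pd.read_csv("calibrated_coefficients.csv")  # symbol, avg_auction_volume, impact_coef

slope, intercept = np.polyfit(np.log(calibrated["avg_auction_volume"]),
                              np.log(calibrated["impact_coef"]), deg=1)
# A negative slope reproduces the inverse relationship described in the text.

def estimated_coefficient(avg_auction_volume):
    return float(np.exp(intercept + slope * np.log(avg_auction_volume)))

print(estimated_coefficient(250_000))  # e.g. a less active stock outside the calibrated set
```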

Figure 2. Market Impact Comparison

Examples

To give some concrete examples of the market impact estimation, let us look at hypothetical orders in four stocks, each with size equivalent to 30% of the volume traded in the last five minutes of the continuous session. The market impact cost estimations for the open/close auctions are shown in Figure 2. For comparison, the market impact cost for the corresponding continuous session estimated by our proprietary model [4] is also shown. For illustration purposes, order sizes in these examples are chosen based on the stocks’ liquidity, so that the market impact cost estimates have similar orders of magnitude. We observe that for all stocks, the market impact cost at the open auction is greater than at the close auction. In addition, the relative value of the impact cost for auctions compared to the continuous session varies from stock to stock.

Our Precision Liquidity Seeking algorithm utilizes market impact models for all three trading sessions (open, continuous, and close) to minimize the market impact cost while satisfying traders’ risk constraints. The optimal trading schedule for executing 1,000,000 shares of Stock A through all three trading sessions, with a target average participation rate of 10%, features no allocation to the open auction, a front-loaded schedule during the continuous session, and a significant allocation to the close auction (more than 10% of the order size is allocated to the close auction).

In contrast, the optimal schedule of executing 100,000 shares of Stock B, with the same 10% target participation rate, has no allocation to the open auction, a skewed U-shaped schedule during the continuous session, and a moderate allocation to the close auction (only 2% of the order size, which is much smaller than that for the Stock A example). The underlying reason for this difference is shown in Figure 2: relative to the corresponding continuous session, Stock A’s close auction is more advantageous than Stock B’s close auction in terms of minimizing the total market impact cost.
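
For intuition about how such an allocation can be optimized, here is a heavily simplified sketch that splits a parent order across the open auction, the continuous session and the close auction so as to minimize an assumed total impact cost. The coefficients and cost shapes are made-up numbers for illustration only and do not reflect the Precision algorithm's proprietary models or its handling of traders' risk constraints.

```python
# Heavily simplified sketch of the allocation idea: split a parent order across
# the open auction, continuous session and close auction to minimize an assumed
# total impact cost. The coefficients and cost shapes below are made-up numbers
# for illustration and do not reflect Precision's proprietary models or its
# handling of traders' risk constraints.
from scipy.optimize import minimize

TOTAL_SHARES = 1_000_000
LAM_OPEN, LAM_CLOSE = 4e-7, 1e-7   # assumed linear auction impact coefficients
KAPPA = 1.5e-5                     # assumed continuous-session cost scale

def total_cost(x):
    open_qty, close_qty = x
    cont_qty = max(TOTAL_SHARES - open_qty - close_qty, 0.0)
    return (LAM_OPEN * open_qty ** 2          # linear auction impact -> quadratic cost
            + LAM_CLOSE * close_qty ** 2
            + KAPPA * cont_qty ** 1.5)        # assumed square-root-impact cost intraday

result = minimize(total_cost, x0=[0.0, 100_000.0],
                  bounds=[(0, TOTAL_SHARES), (0, TOTAL_SHARES)])
open_qty, close_qty = result.x
cont_qty = TOTAL_SHARES - open_qty - close_qty
print(f"open: {open_qty:,.0f}  continuous: {cont_qty:,.0f}  close: {close_qty:,.0f}")
```

With these made-up coefficients the cheaper close auction absorbs noticeably more of the order than the open, echoing the pattern of the Stock A example above.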

Conclusion

Using the imbalance messages from stocks’ primary exchanges, we derive a linear market impact model for open and close auctions, which complements our continuous market impact model. Stock-specific market impact coefficients for auctions are calibrated for Russell 3000 stocks, which are inversely correlated with auction volumes. This relationship is then used to calculate the market impact coefficients for stocks outside the Russell 3000.

With market impact models for all three trading sessions in place, our Precision algorithm is capable of allocating order shares to different sessions in a truly optimized way, which properly addresses traders’ concerns when deciding on the optimal trading trajectory. This capability has been desired for a long time, and we believe that Precision is the first algorithm to provide a complete and coherent treatment of market impact models with this capability.

References

[1] Zhao, K., Cai, H. & Yu, Y. (2019, Q3). Closing Volume Discovery. Global Trading Q3 2019, 15-20. https://old.fixglobal.com/home/closing-volume-discovery/ 

[2] NYSE (March 8, 2019). XDP Imbalances Feed Client Specification.

[3] Nasdaq (August 12, 2019). Net Order Imbalance Indicator Support Document.

[4] Zhao, K. & Liu, L. (2019, Q2). Market Impact in Continuous Market. Cantor Fitzgerald Precision White Paper.

Enterprise Blockchain in 2020: Simplify, Connect, Transact

By the standards of most technologies, blockchain ought to still be in its infancy. The modern computer, for example, has been around for over 80 years. Cloud computing is still gaining market share in many industries, despite having been around since 1996.

Yet the blockchain industry has matured at an unprecedented rate. After an explosion of interest and innovation in the first few years, business leaders are now becoming familiar with the technology’s strengths and are clarifying their ideas about where and how to implement it.

In response, the industry is consolidating as it becomes clear which models are meeting the enterprise need. 2020 is going to be an important year for enterprise blockchain to capitalise on this consolidation via real-world network deployments. We at R3 are organising ourselves around one mantra to help make this a reality: “simplify, connect, transact.”

Simplify

Complexity is the enemy of adoption in enterprise technology. For blockchain to be practical and scalable across many different organisations with very different goals and pre-existing technologies, the underlying platforms need to strive for simplicity and flexibility.

Organisations want to simplify how they launch or join blockchain networks. Business sponsors do not want deployment and operations to delay their time to demonstrable value, and will want their blockchain nodes to ‘live’ where their current infrastructure lives.

This means that deploying a blockchain node needs to be as simple as booting up a laptop for the first time, and connecting to peers on a network should be pain-free, regardless of whether the node is deployed on-premises or across a single- or multi-cloud environment. The leading blockchain platforms in 2020 will need to be able to ‘deploy anywhere’ with the fewest clicks, allowing customers to focus on business value, not deployment headaches.

Enterprise blockchains will also gain a simplicity edge over public blockchains in 2020. The currently-leading public blockchains have injected ever-increasing complexity into their delivery roadmaps for 2020 and beyond, as they grapple with working backwards to achieve the privacy and scalability needed for enterprise buyers.

Organisations will move their attention away from the question of whether a chain is public or private and, instead, start asking ‘who are the validators of transactions?’ and ‘how future-proof is my platform choice?’

Connect

The second big trend in 2020 will be a re-focus on network effects. This will include bootstrapping net-new networks as well as the reemergence of interoperability as a priority to allow those networks to have increased reach.

Just as with deployment, customers will demand a simplified and low-friction experience for starting or joining consortium networks. This enhanced user experience will not be limited to technology considerations but also focus on streamlined governance and ‘consortium economics’ that will allow for enterprises to make better and more informed business decisions about which blockchain networks to join.

This may finally deliver the initial adoption by the to-date mythical ‘fast followers’ within the corporate world, and we are already seeing evidence of this within supply chain and trade networks, such as Marco Polo, which recently brought together over 70 institutions to trial a receivables financing solution on Corda.

Smart contract decoupling will create a lot of buzz, as senior executives do not want to be tied to any one underlying platform, but the market remains too immature and abstracted for this buzz to translate to reality.

Instead, the big steps forward in terms of interoperability will be blending the worlds of “on-chain” and “off-chain” as platforms and projects drive through integration with established business networks. For example, cross-payment rail interoperability, which can harness the best of incumbent rails and emerging blockchain settlement and asset networks.

As a result, participants in the blockchain ecosystem will start thinking about how their software can drive market-level (in addition to firm-level) connectivity and transformation. This is a cultural, as well as technological, change and so will take time to become a reality. However, this is one of the big promises of enterprise blockchain platforms, such as Corda, which enables controlled information sharing and workflow between different organisations without compromising on privacy.

Transact

Blockchain is unique in its ability to deliver market- and industry-wide networks centered on provably scarce digital assets and value being transacted directly and with finality. 2020 will reveal the demonstrable value of blockchain networks over traditional centralised databases as we move into the ‘transact’ phase.

Key to this will be the continued adoption by regulated institutional exchanges for high-quality digital assets. Such token initiatives were a significant part of the story of 2019 with a number of high-profile projects, such as SDX in Switzerland, making major steps forward.

Many of those projects have set robust goals for adoption and first transactions in 2020. For a new technology to have progressed from the fringes of the development community to underpin regulated exchanges, built by some of the biggest names in institutional finance, in just over a decade is astonishing.

Regulated asset network deployments will be coupled with an increased pace of innovation in digital value, especially for regulated or central bank-backed digital currency. The current reactionary movement against private or crypto stablecoins will pick up pace, as the public sector looks to regain the initiative against cryptocurrency by fast-tracking experiments (and in some cases deployments) of digital currency.

2020 is going to be something of a coming of age year for blockchain. We’ve been part of an incredible wave of innovation and experimentation over the past few years, with firms putting forward many different models for how to best deploy and govern blockchain networks in the enterprise context.

Many of those debates, however, are becoming settled now as the industry consolidates around the most suitable models. Under the exciting pressure of deployments planned for 2020, the players that emerge as leaders will be those with a pinpoint focus on simplifying the blockchain journey, fostering connections and enabling live, seamless transactions.

QUICK TAKE: Stay Balanced in ETFs: SSGA

If one is going to trade exchange-traded funds, the key theme for 2020 is balance.

In other famous words, float like a butterfly and sting like a bee….

Investors should neither overweight nor underweight the sector, and should position their portfolios to reflect growing market risks, including outflows, according to State Street Global Advisors.

In their latest report and review, “The 2020 ETF Market Outlook: Threading the Needle,” analysts there, including Matthew Bartolini and Michael Arone, said that particular care must be exercised in the sector after the last few stellar years, and that despite investors feeling they might have missed out, caution is the key this year.

“As the fear of missing out on future gains creeps in, investors might consider altering their view on risk assets and then jump back in with both feet,” SSGA wrote. “However, blindly buying equities in 2020 could be a bigger risk than not owning them in 2019. Investors should consider strategies that limit the impact of volatility while pursuing equity returns.”

Matthew Bartolini, SSGA

Furthermore, generating sufficient levels of income within bond portfolios should be “more about balancing duration, credit and geopolitical risks and less about reaching for double-digit returns,” they said. Bartolini advised that active strategies with the ability to rotate and “pick up” yield across bond sectors may help to balance these risks.

“Investors have the ability to precisely tailor bond portfolios for the year ahead and, based on their risk profile, create customized and flexible active tilts to broad-based Aggregate bonds,” he wrote.

Michael Arone, SSGA

Michael Arone, Chief Investment Strategist at SSGA, concluded that with broad-based stocks and bonds at all-time highs—and an ever-evolving backdrop also experiencing all-time high uncertainty—having an alternative solution with low correlations to traditional markets as part of the asset allocation mix may be beneficial in 2020.

“The historical low-correlation structure of gold to stocks and bonds has manifested itself in positive average returns during bouts of volatility,” Arone wrote.
