
Volumes Rise In European Dark Pools

Trading on European dark pools reached its highest level since MiFID II went live despite the regulation aiming to shift volumes onto lit exchanges.

Tim Cave, Tabb Group

Tim Cave, analyst at consultancy Tabb Group, said in a report that trading on European dark pools was 9.1% of all on-exchange activity last month, the highest level since MiFID II came into force in January 2018.

“While the amount of dark trading fell in the early months of MiFID II, it has gradually risen since the first set of caps expired in September 2018, reaching its highest level in April since before the implementation of the new regulatory regime,” he added.

He explained that the biggest factor affecting volumes is the double volume cap regime for trading in dark pools introduced under MiFID II. The caps limit dark trading in a stock to 4% of total volume on any one dark pool and 8% across all dark venues. Volumes are monitored each month on a retrospective 12-month basis, and stocks breaching the caps are banned from trading in the dark for the next six months.
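As a rough sketch of the cap arithmetic (the venue names and notionals below are hypothetical, and ESMA's official calculation has further subtleties):

```python
# Hypothetical check of the MiFID II double volume caps for one stock,
# using made-up 12-month trailing volumes (EUR notional).
dark_volume_by_venue = {"DarkPoolA": 420_000, "DarkPoolB": 150_000}
total_eu_volume = 9_000_000  # all trading in the stock across the EU, lit + dark

total_dark = sum(dark_volume_by_venue.values())
print(f"Market-wide dark share {total_dark / total_eu_volume:.1%} "
      f"(8% cap breached: {total_dark / total_eu_volume > 0.08})")
for venue, vol in dark_volume_by_venue.items():
    share = vol / total_eu_volume
    print(f"{venue}: {share:.1%} (4% cap breached: {share > 0.04})")
```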

Richard Johnson, principal in the market structure and technology practice at consultancy Greenwich Associates, where he focuses on equities and financial technology, said in a report that multilateral trading facilities and exchanges are capturing only about one third of buy-side flow under MiFID II.

“While it is possible that there is some margin of error in this figure, it is nevertheless a sign that regulators’ objective of steering equity trades away from dark and toward lit venues has not succeeded,” added Johnson.

Sourcing Liquidity

The Greenwich survey, MiFID II Shapes European Equity Trading, also found that nearly half (44%) of buy-side traders find sourcing liquidity harder under MiFID II, while only 16% said it has become easier.

A poll at the FIX Community EMEA conference in March found that regulators should make the double volume caps on equity trading in dark pools their first priority under the mandated review of the operation of MiFID II.

Rebecca Healey, Liquidnet

Rebecca Healey, head of EMEA market structure and strategy at Liquidnet, the institutional investor block trading network, said in a report that the responsibility to source liquidity in an evolving market eco-structure increasingly sits with the buy side alongside the obligation to provide best execution to end investors.

She said in the report, What We Learnt At Tradetech: “Unbundling has irrevocably changed the traditional buy and sell-side relationship, with nearly 60% of the audience highlighting the impact of systematic internalisers and electronic liquidity providers as the new providers of liquidity.”

MiFID II required the separation of trading commissions from payments for research. As a result, the majority of asset managers have chosen to pay for research out of their own revenues.

Periodic Auctions

Cave added that volumes also increased in periodic auctions last month. Average daily notional was more than €1bn ($1.1bn) for the first time since October 2018. Periodic auctions accounted for 2.4% of on-exchange activity, compared with 2.2% in March, according to Tabb.

Periodic auctions are different from the traditional opening and closing auctions on exchanges as they can last for very short periods of time during the trading day and can be triggered by market participants, rather than the venue. They are considered to be lit trades under MiFID II, as the indicative matched size is published prior to execution.
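A stylised way to see how a batch auction uncrosses is that it picks the price level that maximises executable volume; a minimal sketch with made-up orders (not any venue's actual matching logic):

```python
# Toy uncrossing logic for a periodic (batch) auction: choose the price
# that maximises matched volume between buys and sells. Orders are
# hypothetical (limit price, quantity) tuples.
buys = [(100.02, 500), (100.01, 300), (100.00, 200)]   # buy at price or better
sells = [(99.99, 400), (100.00, 300), (100.01, 400)]   # sell at price or better

candidate_prices = sorted({p for p, _ in buys + sells})

def executable(price):
    buy_qty = sum(q for p, q in buys if p >= price)
    sell_qty = sum(q for p, q in sells if p <= price)
    return min(buy_qty, sell_qty)

uncross = max(candidate_prices, key=executable)
print(f"Uncrossing price {uncross}, matched size {executable(uncross)}")
```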

The European Securities and Markets Authority issued a call for evidence on periodic auctions in November last year as the regulator said there are concerns that frequent batch auctions may be used to circumvent the double volume caps. However, market participants largely responded that they value the function played by periodic auctions in the market.

Johnson said that more than three quarters of buy-side desks are using periodic auctions. “There is broad satisfaction, with 40% noting they have had a good or excellent experience, as opposed to 7% with a negative opinion,” he added.

Systematic Internalisers

MiFID II banned broker crossing networks and required broker-dealers to set up systematic internalisers in order to provide principal liquidity to clients.

Cave said there was also a significant increase in activity on SIs last month, particularly above the large-in-scale thresholds. Addressable daily notional SI volumes totalled €9.6bn last month, compared with €7.7bn in March according to Tabb.

Richard Johnson, Greenwich Associates

Greenwich Associates added that SIs are now capturing about 14% of all buy-side flow and have proved a popular venue for institutional investors to access liquidity. The survey said almost 90% of buy-side traders are connected to at least one SI, with Bank of America Merrill Lynch, JP Morgan and Goldman Sachs having the highest penetration.

“The popularity of SIs in Europe will likely not end there,” said Johnson. “Now that they have demonstrated their ability to satisfy buy-side demands for liquidity and offer a way for brokers to differentiate themselves, we may see them spread to other markets.”

More data

However, Healey noted that the overall proportion of liquidity migrating to systematic internalisers and periodic auctions still remains less than the activity in closing auctions. Periodic auctions represented only 2% of total market volume and SIs around 15% in the first quarter of this year, according to Liquidnet.

“Given the current Esma review on frequent batched auctions as well as forthcoming amendments to SI regulation, these market share percentages are likely to change as liquidity formation adjusts accordingly,” she added. “While liquidity is not yet shifting back to the continuous lit market, it is being repackaged as the sell side connects to new sources of liquidity in a bid to deliver enhanced execution.”

Healey continued that future disruption and innovation in European capital markets are likely to come from increasing use of data and analytics.

“One of the biggest changes in trading strategies since MiFID II has been the growing focus on transaction cost analysis as the buy side continually seeks to understand brokers’ execution strategies and focus on implicit as well as explicit costs of trading,” she added. “The requirement now is for solutions which provide the buy side with a truly independent means of analysis alongside analysis provided from execution partners.”

The Post Trade Debate Deserves to be Data Driven

By Phil Mackintosh, SVP, Chief Economist, Nasdaq

In the past 20 years, markets globally have automated and modernized. Computers are now responsible for the majority of executions in stocks around the world, and the quantity of data available to analyze trading has exploded. There are now more than 50 venues for trading, and over 35 million trades every day in the US equity market. Just recently FINRA highlighted that it had 135 billion records in a single day. At the same time, transaction costs have fallen significantly. That makes squeezing the last basis points out of your trading desk increasingly complicated.

We are arguably in the last mile of institutional TCA improvement, where routing, signaling and opportunity costs are a material factor to measure. Unfortunately, regulatory solutions like the new 606 and existing 605 rules remain aggregated, making them inadequate for institutions to quantify these costs. Institutions need granular access to their routing and trading data: timestamps in microseconds for order send, receipt and cancellation across all trading and IOI venues. Once they have that, harnessing the modern powers of “big data” will make it easier to find patterns and quantify costs. Only then can they improve strategies that reduce trading frictions.

Enhancing FIX to do this is a compelling idea. FIX is already widely used globally for sending order-level data back to institutions. A globally consistent file format would allow other experts to cost-effectively build analytic solutions. Most importantly, institutions would have access to all their data, giving them the power to analyze what they choose. The economics of routing deserves to be data driven.
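To make the idea concrete, here is a minimal sketch of pulling routing fields out of a FIX execution report; the message below is fabricated, but the tags used (11=ClOrdID, 30=LastMkt, 60=TransactTime, 150=ExecType) are standard FIX fields:

```python
# Minimal sketch: extract venue and timestamp fields from a FIX execution
# report for routing analysis. The message is fabricated; a real feed would
# supply microsecond timestamps for every order send, receipt and cancel.
raw = "8=FIX.4.2\x0111=ORD123\x0130=XNAS\x0160=20190503-14:30:01.123456\x01150=F\x01"

fields = dict(pair.split("=", 1) for pair in raw.strip("\x01").split("\x01"))
print(f"Order {fields['11']} executed on venue {fields['30']} "
      f"at {fields['60']} (ExecType={fields['150']})")
```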

2019 Women in Finance Awards Asia Winners Announced

GlobalTrading Journal and Markets Media Group hosted the inaugural Women in Finance Awards Asia on Friday, May 3 in Hong Kong.

The sold-out luncheon event at the Four Seasons Hotel convened about 150 top-level executives from across the capital markets industry, including leading buy-side managers, sell-side banks, HKEX and technology providers. Buy-side firms in the audience included BlackRock, T. Rowe Price, AllianceBernstein, JP Morgan Asset Management, State Street Global Advisors, and Fidelity.

Planning and development of the event was spearheaded with the support of the industry advisory group, which features top executives from the leading organizations based in Asia.

The awards event recognized 14 women across 13 firms:

CEO of the Year: Azila Abdul Aziz, Kenanga Futures
Excellence in Leadership: Nellie Dagdag, DTCC
Individual Achievement: Lauren Degney, Citi
Crystal Ladder: Jessica Morrison, Morgan Stanley
Trailblazer: Carrie Cheung, BNP Paribas
Excellence in Sell-Side Trading: Josephine Kim, BAML
Excellence in Buy-Side Trading: Christine To, T Rowe Price
Best in Trading Platforms: Shilesh Shekhawat, Virtu Financial
Excellence in Fintech: Lisa O’Connor, SWIFT
Excellence in Asset Management: Belinda Boa, BlackRock

Rising Stars:
Nicola McGreal, State Street Global Advisors
Sandy Ngai, AllianceBernstein
Vivian Peng, JP Morgan Asset Management
Lily Xiong, BlackRock

The event was greatly supported by sponsors including Citi, BlackRock, BNP Paribas, DTCC, SWIFT, HKEX, Bank of America Merrill Lynch, Nomura/Instinet, Virtu Financial, Refinitiv, Pathfinder and Morgan Stanley.

Look out for upcoming video interviews with the most recognized women in the industry.

The Markets Choice Awards: Women in Finance will be hosted in New York City in November 2019.

Interested in being involved in future events in Asia or globally? Please contact us at info@fixglobal.com.

Congratulations to all the winners!

Solving Execution: Contextual Analysis, Intelligent Routing and the Role of the Algo Wheel

By Will Psomadelis, Head of Trading, Australia,
Global Head of Electronic Strategy Research, Schroders

Trading is an integral part of the investment management process. Anyone who has ever constructed a portfolio strategy knows the extreme joy of discovering a pocket of alpha that they can’t believe hasn’t already been exploited, only to face the immense frustration of watching transaction costs erode it away.

It is also almost the perfect case study of where quantitative inferences can:
1. successfully augment trader knowledge for some trading decisions;
2. be ignored due to inadequate data, excessive variance or during event trades where human intuition is important;
3. replace the human where the human would generally be unable to recognise patterns either due to speed or where multiple factors with complex relationships are driving the outcome.

There has never been a better time for traders to bring research in-house and exploit their own ideas using their own proprietary information in the search for alpha, rather than relying on third-party vendors. Trading that is not undertaken intelligently is a tremendous source of alpha to other firms looking for it.

Maybe it’s human nature, but the industry has a long history of taking somewhat sensible ideas, and over time, distorting the outcomes such that they barely reflect the original intention. Take the VWAP algorithm, for example. What started out as a simple validation of how an order was executed against a possible range of prices was turned into a widely accepted, yet gameable, benchmark. This spawned a race by the sell-side to provide an algorithm that produces a near-deterministic signal that dominated the industry for a decade (and still does in some regions, despite better alternatives). Much time has been dedicated to this topic across the industry, including our own organisation (‘Ignorance is not bliss’, Schroders, 2018, ‘Know thy counterparty’, Schroders, 2013).

Transaction Cost Analysis (TCA) and the Algo Wheel are, independently, incredibly useful; however, there is a risk that they suffer the same fate if neither achieves its objective: improved portfolio returns.

TCA is a very broad area of study that covers everything from individual fill analysis of execution strategies to parent order costs that are the culmination of every decision taken in the life of an order. The challenge is to use this data on a systematic basis to improve the decision process and create measurable value.

The Algo Wheel is a necessary innovation to collect an exploration data sample of orders with reduced human bias. This data should be used to create improved routing decisions and therefore execution outcomes. The risk of the Algo Wheel is when the line is blurred between the exploration mechanism and the exploitation router.
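The exploration half can be as simple as unbiased random assignment; a minimal sketch (the strategy names are hypothetical, and this is not Schroders' implementation):

```python
import random

# Exploration: assign each order to an algo strategy uniformly at random,
# removing the trader's selection bias from the resulting dataset.
STRATEGIES = ["BrokerA_IS", "BrokerB_IS", "BrokerC_Liquidity"]  # hypothetical

def wheel_assign(order_id: str) -> str:
    """Uniform random strategy assignment for the exploration sample."""
    return random.choice(STRATEGIES)

print(wheel_assign("ORD-001"))
```

Keeping this randomiser separate from any performance-weighted router is precisely the exploration/exploitation distinction the author describes.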

There is a need to understand and critically evaluate the direction of industry research within the context of our own investment and trading objectives. Schroders has deployed an internally-designed product named QTRMASTER© to provide a Recommendation/Routing solution to the trading team. This model relies on extending traditional TCA research to improve order timing and, using Algo Wheel data, to improve algorithm strategy selection.

Continuous investment in the area of execution research is crucial in a competitive landscape, QTRMASTER being a key expression of this.

UNDERSTANDING ORDER TIMING TO IMPROVE OUTCOMES (TCA VS. CONTEXTUAL TRADING ANALYSIS)
Implementation Shortfall (IS) is an attempt to align the trader’s objective with that of the portfolio, in the hope of producing commercial-grade portfolios that can exist beyond an academic paper.

Whilst IS does provide the necessary data to capture the realised cost to the portfolio of execution, it fails to apply any context and therefore fails as a systematic feedback mechanism to the desk.
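For reference, the headline IS number itself is straightforward arithmetic; a minimal sketch of an arrival-price calculation (numbers illustrative):

```python
# Implementation shortfall vs. the arrival (decision) price, in basis points.
# A positive number is a cost to the portfolio.
def implementation_shortfall_bps(side: str, arrival_px: float, avg_exec_px: float) -> float:
    sign = 1 if side == "buy" else -1
    return sign * (avg_exec_px - arrival_px) / arrival_px * 10_000

print(implementation_shortfall_bps("buy", 50.00, 50.06))   # 12.0 bps cost
print(implementation_shortfall_bps("sell", 50.00, 49.97))  # 6.0 bps cost
```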

The grim reality is that traders are hostage to the prices available in the market within a sensible volume window of the order time (we measure periods up to 50 times the volume of the order). For a given order, these prices could be explained by:
• natural variation in the market where for longer duration orders we would expect the variance to increase;
• excessive signal leakage where a consistent reversion profile could be observed; or
• by the alpha decay of the strategy itself when price patterns appear regular.

Knowing this at order arrival can influence decisions taken throughout the order life.

“Contextual Trading Analysis” (CTA) was designed by Schroders and sits within QTRMASTER. The aim of CTA is to infer the value added from trader timing and produce statistics that can be provided to the trader for each new order. Simulations of every order executed across the group allow us to understand the return profile of historic orders for a given set of conditions, including portfolio strategy, and establish potential cost improvements through varying aggression levels. A series of adjustments are made to normalise for differences across strategies.

Traders are ultimately provided with the probability that an order being traded at one aggression level will have a better outcome than an order being traded in another. The trader/algorithm can use this initial belief and update it with real-time information to dynamically adjust aggression.
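One way such a probability can be estimated is by pairwise comparison of simulated outcomes; a minimal sketch with fabricated simulation output (not the actual QTRMASTER methodology):

```python
import random

# Estimate P(passive outcome beats aggressive outcome) by pairwise
# comparison of simulated order outcomes, in bps saved (data fabricated).
random.seed(1)
passive_sims = [random.gauss(30, 40) for _ in range(5000)]     # higher mean, higher variance
aggressive_sims = [random.gauss(-10, 15) for _ in range(5000)]

wins = sum(p > a for p, a in zip(passive_sims, aggressive_sims))
print(f"P(passive better) ~= {wins / len(passive_sims):.1%}")
```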

The example below shows the potential improvement curves for 2 individual fund groups as well as the variance (shaded) across each simulation.

Looking at the density plot below for a particular order type, QTRMASTER provides the trader with an estimation that an extremely passive strategy has a ~68.5% probability of providing a better outcome than trading in an extremely aggressive strategy. The alpha decay curve on the right for the same order type shows the potential improvements from executing across various strategies. Against the same estimated cost, an order executed extremely passively could achieve a 50bps improvement, versus being 20bps worse off if traded extremely aggressively.

This order type has a slow alpha decay profile for this fund indicating there is no initial requirement for immediacy. (IMAGE 1A)

Source: Schroders Investment Management, 2019

Using the combination of generally accepted trading cost metrics (IS) and our own Contextual Trading Analytics, we are able to push parent order TCA into 2 studies. Firstly, raw Implementation Shortfall is incredibly useful to determine the cost to the portfolio of execution. Secondly, CTA closes the feedback mechanism by providing context as to the genesis of those slippage costs. CTA also provides some insight as to how a trader can amend order timing to potentially improve them going forward.

APPLYING FORECASTING TO IMPROVE STRATEGY SELECTION (THE ALGO WHEEL)
The global nature of our trading desk is such that execution research centralised in Australia can be distributed to our global desks, ensuring the same (or similar) models are used across the group and harmonising execution philosophy. This centralised research function is augmented by a group of global traders, experts in their own regions, who ultimately determine the selection of strategies in each Algo Wheel implementation.

In a world of infinite orders, there would be no issue constructing datasets that reflected evenly distributed order types across all execution strategies. In reality, we are constrained with an order budget to obtain meaningful data. The randomised Algo Wheel is deployed within QTRMASTER to overcome this constraint.

The wheel is tasked to create a balanced dataset across strategies, ensuring a relatively even distribution of orders across different variables/features and removing human bias. This ensures that certain broker/strategy results are not skewed by the order types they receive. This balanced dataset is used to train a Random Forest Regression model that is at the core of our intelligent routing model. Cross-validation is deployed to address overfitting. This framework has been in place since 2016, with a global rollout occurring as markets became suitable. The research, however, is an ongoing process.
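A minimal sketch of that modelling step, assuming scikit-learn and a fabricated balanced dataset (feature names illustrative; not Schroders' actual model):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Fabricated balanced dataset: order features -> realised cost in bps.
rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.uniform(0, 0.2, n),    # order size as a fraction of ADV
    rng.uniform(1, 30, n),     # spread in bps
    rng.uniform(0, 6.5, n),    # hours since the open
])
y = 80 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 2, n)  # synthetic cost

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
print(f"CV MAE: {-scores.mean():.2f} bps")  # cross-validation guards against overfitting
model.fit(X, y)
```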

We do not believe in ranking algorithms or issuing scorecards on aggregate datasets through the use of averages across arbitrary bins in single dimension models. Such a methodology ignores the sensitivity of the strategy to other factors.

Using univariate analysis, especially on arbitrary bins, seems to be a confusing legacy of history. With techniques readily available to properly group orders (clustering, CART, etc.; see the sketch after this list), relying on arbitrary bins has two main deficiencies:
• The outcome (cost) of an order is dependent on many variables. ADV is generally the largest driver, but factors such as spread, embedded alpha, time of day and market cap, amongst others, also contribute; and
• Discretising continuous variables at arbitrary intervals ignores a significant amount of information that could improve the forecasts.
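For instance, a clustering pass groups orders by their full feature vectors rather than by hand-drawn ADV buckets; a minimal sketch with fabricated order features:

```python
import numpy as np
from sklearn.cluster import KMeans

# Group orders by their complete feature vectors instead of arbitrary
# single-dimension ADV bins (features fabricated for illustration).
rng = np.random.default_rng(0)
orders = np.column_stack([
    rng.uniform(0, 0.2, 500),    # %ADV
    rng.uniform(1, 30, 500),     # spread (bps)
    rng.uniform(0.1, 3.0, 500),  # volatility (%)
])
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(orders)
print(np.bincount(labels))  # data-driven groups replace hand-drawn bins
```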

The arguments against ranking algorithms on aggregate datasets also highlight why the all-too-common question of “What is the best algorithm?” is futile, and why QTRMASTER attempts to answer the question of “What is the best predicted outcome for each algorithm for an order with a specific set of features?”

QTRMASTER preserves a clear distinction between the Algo Wheel randomiser used to explore data, and a router based on the outcomes of a Random Forest model which is designed to exploit the lessons learned. If weights based on recent performance alone are applied to the randomiser, which seems to be a popular technique, distributions between strategies may no longer be equal; the unbalanced-data problem that the algo wheel attempts to nullify is re-injected into the dataset.

For QTRMASTER to route orders, it is necessary to understand the behaviours of each strategy relative to the range of variables or order features. Each new order is decomposed into its underlying features that will contribute most to the cost with the strategy selected being the one that can improve returns across the variables of each order. The flow of data is presented in IMAGE 1B.

Source: Schroders Investment Management, 2019
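Conceptually, the exploitation step scores the new order's features against a cost model per strategy and routes to the best prediction; a minimal sketch with fabricated models and features (not the production router):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical exploitation router: one cost model per strategy; route the
# order to the strategy with the lowest predicted cost for its features.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (500, 3))
models = {}
for name, slope in [("BrokerA_IS", 40), ("BrokerB_IS", 60), ("BrokerC_Liq", 25)]:
    y = slope * X[:, 0] + 10 * X[:, 1] + rng.normal(0, 2, 500)  # synthetic costs
    models[name] = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

new_order = np.array([[0.08, 0.5, 0.3]])  # [%ADV, spread, vol], fabricated
predicted = {name: m.predict(new_order)[0] for name, m in models.items()}
best = min(predicted, key=predicted.get)
print(predicted, "->", best)
```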

Unfortunately, machine learning has become an overused and over-hyped term in finance in recent times. Machine learning within QTRMASTER had to be a last resort after trying other more traditional statistical models to ensure the data was understood and that complexity wasn’t being added to the process for no gain.

One common complaint of machine learning models is the “black box” nature of what happens within them. In the case of linear regression, considerable insight can be gained into the structure of the model by simply viewing the coefficients. There is no such neat output for machine learning, so alternative techniques have been designed to observe how the underlying model is likely to route orders and how each algorithm responds to each variable in isolation. (IMAGE 1C)

Source: Schroders Investment Management, 2019

Using Partial Dependence Plots (above), we can observe the cost response (y-axis) to an individual variable, in this case the size of the order measured as a percent of ADV (x-axis). The colours represent the cost responses for each execution strategy. Order size has a complex, well-beyond-linear relationship to cost (green line), a finding not dissimilar to those of our investigation into industry-accepted transaction cost models.
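Such plots can be produced directly from a fitted model; a minimal sketch assuming scikit-learn's partial dependence utilities and fabricated data:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import PartialDependenceDisplay

# Partial dependence of predicted cost on one feature (%ADV), averaging
# over the joint distribution of the other features. Data fabricated.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (1000, 3))
y = 50 * X[:, 0] ** 2 + 5 * X[:, 1] + rng.normal(0, 1, 1000)  # non-linear in %ADV
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

disp = PartialDependenceDisplay.from_estimator(model, X, features=[0])
disp.axes_[0, 0].set_xlabel("order size (%ADV)")
disp.axes_[0, 0].set_ylabel("predicted cost (bps)")
plt.show()
```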

Some of the effects of individual order features are individually strong whilst others are only strong due to interaction effects. Variable interactions can arise when the effect of one variable on order outcome differs depending on the level of other variables.

The surface plot below, for example, shows how market cap interacts with volatility. The interaction is demonstrated by observing a larger effect of volatility on outcome (z) for smaller market cap stocks than for larger ones. (IMAGE 1D)

Source: Schroders Investment Management, 2019

Even with increasingly sophisticated techniques, it remains that the most important step in any research is to understand the data and domain and construct a robust process that questions the outcomes. Complexity should not be added for complexity’s sake.

Disclaimer: Opinions, estimates and projections in this article constitute the current judgement of the author as of the date of this article. They do not necessarily reflect the opinions of Schroder Investment Management Australia Limited, ABN 22 000 443 274, AFS Licence 226473 (“Schroders”) or any member of the Schroders Group and are subject to change without notice. In preparing this article, we have relied upon and assumed, without independent verification, the accuracy and completeness of all information available from public sources or which was otherwise reviewed by us. Schroders does not give any warranty as to the accuracy, reliability or completeness of information which is contained in this article. Except insofar as liability under any statute cannot be excluded, Schroders and its directors, employees, consultants or any company in the Schroders Group do not accept any liability (whether arising in contract, in tort or negligence or otherwise) for any error or omission in this article or for any resulting loss or damage (whether direct, indirect, consequential or otherwise) suffered by the recipient of this article or any other person. This document does not contain, and should not be relied on as containing, any investment, accounting, legal or tax advice. Schroders may record and monitor telephone calls for security, training and compliance purposes. ©2019 Schroders plc. All rights reserved. All copyright and other intellectual property rights in QTRMASTER is and remains the property of Schroders plc and its subsidiaries.

GS: It’s All About the API

Financial services will see old dichotomies like the buy and sell sides as well as front office and middle/back office fall by the wayside in the coming years, according to Goldman Sachs.

R. Martin Chavez, Goldman Sachs

“If you were comfortable living in a binary world where this company is my client and that company is my competitor, you really ought to re-think that,” said R. Martin Chavez, global co-head, securities division at Goldman Sachs, during a fireside chat at the DTCC’s Fintech 2019 Conference in Midtown Manhattan. “The way we see it going is to APIs. I cannot say it enough; it is going in the direction of who produces APIs and who consumes APIs.”

Although Chavez declined to predict when the trend would affect all of Wall Street, he cited sci-fi writer William Gibson: “The future is already here; it’s just not very evenly divided.”

The push to open APIs is the latest attempt by firms within the industry to differentiate themselves from their competition.

Goldman Sachs recently contributed its Securities DataBase API, written in Python, to the open source repository GitHub to provide clients access to the platform’s market data analytics and risk management services.

Some might see it as giving away the investment bank’s secret sauce, but the firm has a business “to the extent that our clients perceive themselves satisfied with our products and services,” he added.

Goldman Sachs views the more than 30 years’ worth of trades, risks, and models that it has curated, cleaned and stored in SecDB as a significant differentiator.

However, exposing a company’s APIs comes with some constraints as clients typically want open APIs from multiple providers that are freely licensed to produce and export, according to Chavez.

“Then the consumers can judge you on your service-level agreement, pricing, capacity, risk or whatever else it is,” he added.

At the same conference, the DTCC announced plans to launch a marketplace later this year where clients can access APIs for the industry utility’s products and services.

The initiative builds upon an internal API marketplace that the DTCC rolled out in 2018 to help its developers find and re-use software within the organization, Michael Bodson, president and CEO of DTCC, said during his opening remarks.

“Longer-term, however, we plan to expand the platform approach to our ecosystem and offer a much broader and diverse set of APIs and capabilities to support them,” he added.

Sustainable Investing Will Dominate Next Decade

Rachel Lord, head of Europe, Middle East and Africa at BlackRock, said sustainable investing will dominate the next 10 years.

Lord took part in a fireside chat at the Innovate Finance Global Summit in London yesterday. She added: “The entire ecosystem will evolve so that sustainable investing is just investing.”

Mark Carney, Governor of the Bank of England, also highlighted sustainability in a keynote speech at the conference this week.

Carney gave an overview of how the UK central bank is responding to climate change. He said the transition to a low-carbon economy will require enormous re-allocations of capital and investments in infrastructure, which have been estimated at as much as $100 trillion globally over the next decade.

Mark Carney, Bank of England

“Firms that anticipate these developments will be rewarded handsomely; those that fail to adapt will cease to exist,” he added. “This will have enormous ramifications both for the financial system and for financial stability.”

In order to improve reporting of climate-related risks and opportunities by companies, the bank helped set up the private sector-led Task Force on Climate-Related Financial Disclosures four years ago.

“The TCFD has now led to a step change in the demand and supply of climate reporting,” he said. “With over $100 trillion in assets now demanding TCFD quality disclosures, a market in transition is now being built.”

In addition the bank is changing its supervisory approach. This month the Bank of England published supervisory and policy statements that set out expectations for banks and insurers regarding their governance, risk management, strategic resilience and disclosure of climate-related financial risks.

The Prudential Regulation Authority at the Bank of England has also set up the Climate Financial Risk Forum to liaise with firms across the financial system. Next month the PRA will require insurers to consider how their businesses would be affected in different climate risk scenarios as part of a market-wide stress test.

The Bank of England is also working with other central banks and supervisors in the Network for Greening the Financial System to improve climate risk management in the core of the global financial system.

“Our priorities include the development of a small number of high-level climate scenarios that can be used in future system-wide stress tests,” added Carney. “By adapting our soft infrastructure in these ways, the Bank will help ensure that the financial system is not only resilient to climate-related risks but also can take full advantage of the enormous opportunities in a new low carbon economy.”

Technology

Lord also discussed how BlackRock uses technology and looks for partnerships with innovative firms.

Rachel Lord, BlackRock

“We didn’t get big thinking that we could do everything by ourselves,” she said. “We have always embraced partnerships and we have a team who goes out to meet start-ups.”

She continued that in the US BlackRock is working with Microsoft to reinvent retirement products.

In December last year the two firms announced they are exploring the development of a new technology platform that helps people develop better saving and investing habits. BlackRock intends to offer the platform in connection with new products that aim to provide a lifetime of income in retirement and would be made available to US workers through their employers’ workplace savings plan.

Satya Nadella, chief executive of Microsoft, said in a statement: “Together with BlackRock we will apply the power of the cloud and artificial intelligence to introduce new solutions that reimagine retirement planning.”

BlackRock also acquired a minority equity stake in Scalable Capital, based in Munich and London, in 2017. Lord said: “They are a digital investment manager using exchange-traded funds as building blocks.”

Patrick Olson, chief operating officer of EMEA at BlackRock, said in a statement: “The retail distribution landscape is evolving at a rapid pace, as consumers increasingly engage with their financial investments through technology. This trend is prompting strong demand from European financial institutions – including banks, insurers, wealth managers and advisory firms – for high-quality technology-enabled investment solutions.”

Last year BlackRock also became an anchor investor in Acorns, a US micro-investing app with more than 3.3 million investment accounts. Acorns helps young Americans automatically invest spare change into diversified ETF portfolios and save for retirement through Acorns Later, the first automated retirement account.

Rob Goldstein, BlackRock

Rob Goldstein, BlackRock’s chief operating officer, said in a statement: “Our partnership is rooted in the shared commitment of our two firms to use technology to help more and more people improve the way they engage with, save and ultimately invest their money. Acorns is a pioneer in creating innovative ways to engage investors in a mobile-first world.”

Changes

Lord continued that over the next decade fund managers will have to change the products they provide.

“We are used to selling funds through distributors but we need to provide solutions to financial problems,” she added.

Another theme is that countries all over the world are rejecting globalisation. Lord said: “Localisation has become more important and fund managers need to make more impact in their community.”

For example, BlackRock chose IntoUniversity, at the foot of Grenfell Tower, as its charity of the year for 2017-18. BlackRock employees can volunteer with IntoUniversity via mentoring schemes, student and staff training sessions and the Big City Bright Future work experience programme, set up by the two organisations.

Lord concluded: “Over the next five to 10 years everything is changing.”

Mindmeld 2019: Buy Side Reaches Turning Point


Mindmeld 2019: Leaders in the Buy-Side Join to Discuss the Industry’s Most Critical Trends

Mindmeld, Indus Valley Partners’ annual buy-side conference held on April 11, brought together some of the most notable CFOs, CTOs, CROs and CIOs across the alternative and traditional asset management industry for an opportunity to learn about and discuss crucial trends related to finance, operations, technology and compliance.

This year’s event began with a special keynote presentation from New York Times best-selling author, Neel Doshi, on the key aspects of an effective operating model that, when properly implemented, can help build a high performing company culture. Following Neel’s presentation, the day consisted of three panels moderated by Indus Valley Partners subject matter experts, training sessions for newly upgraded and existing IVP solutions, and a fireside chat focused on cloud migration moderated by CEO Gurvinder Singh.

Each panel brought together buy-side leaders from various roles to discuss and debate several trends that the industry will face in years to come. Key findings from these discussions include:

Hyper Outsourcing is Here to Stay: In order to remain competitive in this ever-evolving environment, funds will have to adopt a model that can operate seamlessly with an increased number of service providers while maintaining a sense of internal flexibility and control. Gone are the days of the single outsourcing provider model. This archaic strategy is no longer suited for the majority of alternative assets that are being developed and traded privately on a continuous basis. It is now common to see a typical fund leveraging 10-15 outsourcing providers across its entire value chain. Digitalization and the rise of “digital-first” service providers have leveled the cost playing field. The ability to fully leverage technology platforms now enables funds to achieve cost efficiencies that in the past were only achievable by the largest asset managers.

Scaling Enterprise Intelligence and Analytics with a Comprehensive Data Strategy: Many funds, big and small, are finding that implementing an impactful data strategy is no small feat. As alternative data, data science and artificial intelligence grow in popularity, funds are beginning to realize they cannot achieve success with these new tools without first ensuring their core data sets are accurate. The panel highlighted some key steps to developing a successful path:

  • Firms must first establish core processes for data capture, governance and curation of the data sets that need to be governed and curated (trades, securities, positions, PnL, counterparty, risk, performance).
  • Funds are increasingly using base data sets for additional analytics, leading to definitions of a data layer that continuously expands and evolves without tech involvement. Whether implemented as a scalable data warehouse or a data lake, clearly segregating governed and curated data sets from semi- and non-governed data sets is vital.
  • One of the most critical aspects in the path to success is getting buy-in at a business level for treating data strategically. The most successful data initiatives have full strategic buy-in, which in turn enables transparency in governance, data lineage and cataloging, making implementation and change management much easier.

Critical Data Pathways: The volume, velocity and accuracy of data demanded by investors on a daily basis show no signs of slowing. Because of this, it is vital to have a process in place that allows funds to classify various asset types into a common set of attributes so that the data can be easily digested and used across the front, middle and back office. Funds can achieve this by building a flexible data layer that encompasses standard, curated and governed data sets. By building data governance capabilities that distribute the load of data governance to the users most familiar with each data set, funds can avoid overloading technology teams with requests for the same information. Managers will also have the ability to continuously ingest new data sets, derive analytics and report in insightful ways, all of which are critical core competencies for asset managers. A practical path for funds dealing with big data problems that require elastic compute and elastic storage is to leverage cloud capabilities, which deliver this in a highly scalable manner. To combat disorder, funds must have a holistic data governance team, with representatives from each sector of the business, in place to challenge and review data before it becomes classified.

Roadmap to the Cloud: As funds move away from their traditional data centers and begin migration to the cloud, it is important to keep in mind that a simple “lift and shift” approach is not the answer. A practical path was discussed as a case study during the conference’s fireside chat: a large, $50 billion AUM asset manager leveraged IVP EDM as a toolkit to migrate from a traditional data warehouse model to a scalable data layer concept. Harnessing the Azure version of IVP EDM, the fund was able to fully leverage cloud computing’s elasticity of storage and compute. Deep analysis of MPP versus traditional SQL with columnar architecture found that below 4-5 TB of data, the columnar approach performed similarly. The fund is now able to offer users new analytic capabilities, such as redoing ad hoc performance/XIRR calculations on 250MM rows of data in three to four seconds on any dimension.
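For reference, an XIRR-style calculation solves for the annualised rate that zeroes the net present value of dated cash flows; a minimal sketch with fabricated flows (the platform's job is distributing this across hundreds of millions of rows):

```python
from datetime import date
from scipy.optimize import brentq

# XIRR: find the annualised rate r such that the NPV of dated cash flows is zero.
flows = [(date(2018, 1, 1), -1_000_000), (date(2018, 7, 1), 50_000),
         (date(2019, 1, 1), 1_020_000)]  # fabricated cash flows

def npv(rate):
    t0 = flows[0][0]
    return sum(cf / (1 + rate) ** ((d - t0).days / 365.0) for d, cf in flows)

print(f"XIRR = {brentq(npv, -0.99, 10):.2%}")  # root-find the discount rate
```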

Buy-side participants are finding that the industry has reached yet another unique turning point in its story line. As cost pressures and the demand for data reach an all-time high, education has quickly become one of the most, if not the most, effective ways in positioning funds for future success. With an annual gathering of prominent leaders in the space, IVP continues to strive to help buy-side firms harness the power of their data in order to generate insights, reduce risks, gain a competitive edge and discover alpha.

Mosaic Smart Data Aims To Diversify, Expand

Mosaic Smart Data, the fintech which provides real-time data analytics for capital markets, will use funds from its latest investment round to move into new asset classes and to grow internationally.

Matthew Hodgson, chief executive and founder of Mosaic Smart Data, told Markets Media: “The funds will be used for product development as we expand into equities, for R&D into machine learning, so we can provide more analytics, and to expand in the US and Asia.”

He launched the company after finding it impossible to get a consolidated view of data on all trades with one client across all products in his previous roles at Deutsche Bank and Salomon Brothers.

This week Mosaic Smart Data announced it had completed a $9m (€8m) investment round co-led by CommerzVentures and Octopus Ventures and which includes JP Morgan, an existing investor and client.

https://twitter.com/hschwender/status/1122803113222656000

Hodgson said: “We are very proud to work with our investment partners and that J.P. Morgan has invested for a second time.”

MSX, Mosaic’s platform, can generate real-time analytics on any set of fixed income, currencies and commodities from both voice and electronic trading. The new funds will be used to expand into equities. Electronic trading developed in equities ahead of FICC and so traders already have access to large amounts of data for transaction cost analysis and to measure execution.

However, Hodgson argued that as FICC has become more regulated, it has leapfrogged equities in what is possible for real-time analytics.

He added: “Firms want a cross-asset view across all their products in a single place, and for electronic and voice trading to be brought together.”

Through understanding past client behaviour, banks can improve service and profitability by anticipating their future needs. Hodgson said capital markets firms are spending lots of money on market data but cannot convert it into analytics and make it smart.

More and more data

Matthew Hodgson, Mosaic Smart Data

Hodgson added: “Our key value proposition is then aggregation and normalisation of all types of data. Mosaic Smart Data can use machine learning to identify market outliers, provide richer colour and give humans the tools to augment their performance.”

Zihao Xu, Future of Money lead & early stage investor at Octopus Ventures, said in a statement: “One of the most exciting applications of artificial intelligence is combining it with human intelligence to create something which is more than the sum of its parts. Mosaic Smart Data’s machine learning models surface data insights that even the best human quant may not spot, delivering them to staff in a way which lets them act immediately.”

At the start of this year consultancy Greenwich Associates said in its predicted trends for 2019 that the growing importance of data comes up in every market structure and technology conversation.

Greenwich said in a report: “From the $300m spent in 2018 on alternative data to pre-trade transparency to indexing to real-time market data, 2019 will see producers, distributors and consumers of data hyper-focus on gathering more of the right data, storing it in new ways, analyzing it via machine learning and AI, and acting upon it more systematically than ever before.”

The consultancy continued that the amount of data will only increase, and the industry has only scratched the surface in terms of how data can be applied to making money in the markets.

Warren Rabin, co-head of global macro sales and marketing at JP Morgan, said in a statement: “Today’s financial markets are awash in data at a scale never seen before, but what really drives performance is being able to extract truly actionable insights from that data in real-time. Tools like Mosaic that quickly make sense of vast data sets are changing the way our teams respond and operate and are going to become a differentiating factor for banks as they look to add value in their client discussions.”

Buy-side need

Buy-side firms also need real-time analytics to examine their interaction with brokers and clients in order to improve service and profitability as margins are under pressure. Custodians, venues and exchanges could also generate reports from their data.

“We are in advanced dialogue with the biggest buy-side firms,” added Hodgson. “They are moving to providing liquidity but do not have execution analytics in real time. They need performance analytics at an atomic level to determine the quality of their execution and market impact.”

In 12 months’ time Hodgson said Mosaic would like to have established a standard for data analytics across capital markets.

“We will be in production across key tier 1 banks, some buy-side firms and we will have an international presence,” he added.

Exchange M&A Comes To The Fore

Deutsche Börse said it is preserving its firepower for acquisitions as analysts said management’s efforts to accelerate growth inorganically assume a greater urgency following the German exchange’s results.

The exchange reported its first quarter results on 29 April and said secular net revenue increased by approximately 5%, in line with the company’s plan.

Chris Turner, analyst at German bank Berenberg, said in a report that the results were in line with consensus expectations due to help from a one-off gain. “With the group facing a mixed cyclical outlook, management’s efforts to accelerate growth inorganically assume a greater urgency, in our view,” he added.

Last month Deutsche Börse announced that it was acquiring Axioma for $850m and combining the provider of cloud-based portfolio and risk management software with its STOXX and DAX index businesses to form a new company. The two firms have had an existing partnership since 2011 and have jointly developed products, including factor indices and exchange-traded fund offerings.

Sebastian Ceria, founder and chief executive of Axioma, said in a statement: “The union of Axioma, STOXX and DAX under the Deutsche Börse umbrella creates a growth company that is uniquely equipped to help clients capitalize on the critical trends now reshaping the investment management landscape.”

Turner said the tie-up between Deutsche Börse’s STOXX business and Axioma demonstrates a creative approach to inorganic growth, which is encouraging, although the impact on earnings will be modest.

“Deutsche Börse’s efforts to acquire the FXall business hold out the prospect for a more material creation of shareholder value,” Turner added. “With little visibility on probability of completion, let alone the structure, of any such deal, we remain Hold rated on the shares.”

The German exchange confirmed this month that it is in concrete negotiations to acquire some foreign exchange businesses from data provider Refinitiv.

Deutsche Börse said in its results presentation that the Axioma transaction strengthens its pre-trading offering and improves buy-side access. The presentation said: “In addition, smart transaction structure crystalises value of index asset, ensures value generation and preserves M&A firepower.”

Data 

Consultancy Opimas said in a report last month that exchanges are expected to make acquisitions to expand their data businesses over the coming year and enter the alternative data space.

“Data still account for a minority of revenues for most exchanges,” said the report. “Over the coming year, we expect to see a number of these exchanges make acquisitions to bolster their data businesses, as they continue to seek additional revenue streams.”

Octavio Marenzi, Opimas

However, the increasing cost of market data from exchanges has led to complaints from members.

Octavio Marenzi, founder and chief executive of Opimas, said in the report that these complaints are misplaced as market data revenues have been falling at some of the largest exchanges including CME, Deutsche Börse and London Stock Exchange Group.

“For those exchanges where we have seen strong growth, this has typically come from areas such as indices or analytics and has been fuelled by a large number of acquisitions in the area, rather than organic growth,” he added.

David Schwimmer, chief executive of the London Stock Exchange Group, said in its results presentation this week that the UK firm is investing in and growing the Information Services business, including developing multi-asset capabilities and data and analytics.

ICE

Intercontinental Exchange, the US operator of global exchanges and clearing houses and provider of data and listings services, announced today that it has agreed to acquire Simplifile, which operates an electronic platform for residential mortgage records. The acquisition will expand the ICE Mortgage Services portfolio.

Paul Clifford, founder and president of Simplifile, said in a statement: “We’ve seen how ICE has helped to transform markets going through an analog to digital conversion and has made them more transparent and efficient for all participants. We are closely aligned with ICE’s vision as it applies to the residential mortgage industry and, as we become part of ICE, our team at Simplifile will continue our efforts to simplify the industry for all of its stakeholders.”

More Transparency Should Improve Execution Quality

Ryan Larson, Head of Equity Trading (US), RBC Global Asset Management (US)

Nasdaq recently offered its recommendations on creating a more modern market structure and, separately, a collective group of market participants launched the Routing Transparency Initiative. Both efforts should be applauded for contributing to the debate on how to improve our markets.

A collective group of diverse market participants should agree on common ground to deliver a global standard for routing FIX orders that provides additional transparency and builds upon the SEC’s Rule 606 enhancements, ultimately providing transparency into child-order routing decisions. My expectation is that, with this additional level of insight, participants would be able to analyze data that aligns with order intentions, and definitively evaluate and determine routing and venue quality, including the potential impact of un-executed orders.
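With child-order routing data in hand, that venue-quality analysis reduces to aggregation; a minimal sketch with fabricated child orders, assuming pandas:

```python
import pandas as pd

# Fabricated child-order records: routed vs. executed quantity per venue.
child_orders = pd.DataFrame({
    "venue":    ["XNAS", "XNAS", "BATS", "BATS", "DARK1"],
    "routed":   [1000, 500, 800, 700, 900],
    "executed": [1000, 0, 650, 700, 300],
})

by_venue = child_orders.groupby("venue")[["routed", "executed"]].sum()
by_venue["fill_rate"] = by_venue["executed"] / by_venue["routed"]
by_venue["unexecuted"] = by_venue["routed"] - by_venue["executed"]  # potential impact
print(by_venue)
```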

However, there is an element of “be careful what you wish for”. For firms not mandated to consume venue-level information as part of their best execution review process, there is an additional sense of fiduciary duty: the data is out there, so now what do we do with it? Some firms may not have adequate resources, or may be structurally unready, to consume this level of information, and the potential benefits would simply end up as an unused cost to the industry.

Perhaps a shortcoming of Reg NMS is its somewhat narrow definition of best execution. While price is certainly a factor, many more variables could be considered in satisfying best execution obligations, especially given the diversity of ecosystem participants and their different objectives in the marketplace. Other issues worth resolving include a one-size-fits-all market structure and the potential elimination of the OPR; liquidity aggregation for smaller-capitalization companies; SIP reform and oversight; and SRO, ATS and data governance.

We remain very supportive of the Routing Transparency Initiative, and encourage others to consider the benefits it would bring.
