
Transforming Trading With Machine Learning

By Lee Bray, Asia Pacific Head of Equity Trading, J.P. Morgan Asset Management

The firm’s Asia Pacific equity trading team aims to have around 50% of trading automated with machine learning by the end of this year.

J.P. Morgan Asset Management has invested significant resources in building machine learning tools to enhance global equity trading, as the rise of artificial intelligence and automation transforms how we conduct business.

The firm’s Asia Pacific buy-side equity trading team developed a model utilizing machine learning to help its portfolio managers make the execution of trading orders more effective and cost-efficient. The proprietary model, which was developed by the firm’s quantitative analysts and traders, uses data patterns to find the optimal execution strategy for trading orders. The Asia Pacific trading desk is on track to have 50% of equity trading flow driven by machine learning by the end of 2018.

With myriad options available for executing any given order, particularly smaller or more routine orders, an intelligent model can identify the best execution more efficiently than a human. The artificial intelligence model “learns” constantly about the best outcomes for trading orders and adapts by recalibrating as market conditions change and new information is delivered. Using these algorithms to pinpoint the probability of best performance, the machine learning environment auto-routes and executes accordingly.

To develop the model with machine learning, we tapped into techniques more commonly found at companies like Facebook and Google. By creating a systematic, adaptive model able to alter actions based on mathematical patterns rather than relying on human input, we’re transitioning equity trading to be more scientific and quantifiable.

Currently the model gives trading recommendations to human traders, but it is increasingly taking on an automated role in executing trades. As its application becomes more scalable, it has the potential to generate cost savings for J.P. Morgan Asset Management’s clients through greater trading efficiency.


Announcement: FIX Trading Community: Neena Dholani


Business And Technology At The Summer London Regional Event 2018

By Neena Dholani, Global Marketing and Programme Director, FIX Trading Community

The first half of 2018 has been far from quiet, whether it was the Beast from the East bringing trains to a standstill (and fun in the snow for our children), the endless conversations about Brexit, or the Royal Wedding bringing cheer on a sunny day. As for the City of London and the wider European markets, the first six months have been just as busy, with the implementation of MiFID II standing out as an all-consuming challenge. It’s still early days, but things appear to be moving in the right direction, albeit with some issues still needing attention. A lot of that early success appears to stem from a general climate of collaboration and information sharing within the industry, and in large part from firms’ willingness to adopt change and technological innovations to drive the industry forward. Quite simply, business and technology working in sync to get the job done!

FIX Trading Community Working Groups have been more active than ever as testament to this, reaching a crescendo at the 10th anniversary FIX EMEA London Trading Conference held in March, where over a thousand members and participants gathered in Old Billingsgate to discuss key topics, network with their peers, and see the latest technology exhibited and showcased. Up for discussion on the day were myriad themes, principally the key European regulatory challenges emerging, in particular, from the 20,000 pages of MiFID II. Panel conversations were far-reaching, with some of the highlights being best execution, post-trade data flagging and the use of the right data, the electronification of fixed income, cybersecurity, and the use of blockchain. The one overriding concern that emerged was the central importance of good standards and good data.

Discussions continued in April at the FIX Trading Community Americas Briefing in New York, where participants agreed cryptocurrencies need to be in full focus, with planning around them now a priority in the US. The electronification of the fixed income markets, as well as digital crypto and blockchain, are also areas that members agreed were a high priority.

Then in May, the FIX Trading Community Milan Regional Meeting looked at data quality issues posed by trade and transaction reporting – a key stepping stone to achieving MiFID II objectives. Panellists addressed how firms should use the data they have access to, as well as the transparency challenges of data fragmentation, different interpretations of data, different standards, challenges in order and record keeping, and how FIX can support firms facing these challenges. Members of the FIX Trading Community are actively working on all of these topics throughout the year in the Steering Committees, Subcommittees and Working Groups across EMEA, the Americas and AsiaPac.

Business must continue as usual, as fresh challenges require dynamic responses, which is why the FIX Trading Community is holding its London Summer Regional Event on the 27th June, this year kindly hosted by Colt Technologies. Designed for buy-side and sell-side, exchanges and vendors alike, the agenda will provide expert insight into the challenges ahead. It stands to be a lively meeting, six months on from the MiFID II deadline, discussing where we are with implementation as businesses adopt new practices to adjust to the regulatory framework. As with all FIX Trading Community events, the neutral environment enables free discussion, and the evening will bring together industry experts who will focus on current issues that will impact electronic trading in the EMEA region for years to come, and provide essential insights on current and future trends. As well as hearing from the speakers, the event itself will provide an essential networking opportunity to discuss the Working Groups’ activities, ranging from cybersecurity to blockchain, MiFID II and fixed income, as well as TCA, best execution and more.

The individuals who give up their spare time to work together to solve some of these pressing issues through the FIX Trading Community Working Groups make a huge difference. I joined FIX Trading Community as Global Marketing and Programme Director recently, having previously come from the world of electronic fixed income trading, where I worked for many years with banks selling into the buy-side and engaging with vendors to increase the electronification of the fixed income market. I could see, before I joined, the difference that collaboration between business and technology can make. It is clear that regulation has been a driver of technology adoption in this area and that FIX Trading Community will continue to play a key role in helping the industry achieve these goals.

As we go into the second half of the year, we will all be looking ahead at the trading landscape for 2019 and beyond. We know that the London Regional Event on the 27th June will be an opportunity for members to engage and collaborate on core industry challenges. I, for one, look forward to meeting you there and hearing about the issues you think are key to moving the industry forward.



Buyer’s Guide: Holistic Surveillance Solutions


GreySpark Partners presents a report reviewing changes to banks’ surveillance needs, the drivers of those changes, and the sell-side trade and order surveillance solutions of four third-party technology vendors. The solutions surveyed for this report are:

  • bnext’s CMC eSuite;
  • Behavox’s Behavox Compliance;
  • eflow’s TZ;
  • Fonetic’s Fonetic Trade Comms Suite.


ESG: Can Hedge Funds Save Our Oceans?

By Philippe Burke, Portfolio Manager, Apache Capital

There are several ways that large asset managers could demand a change in behaviour from publicly traded polluters.

We’ve all read alarming reports of collapsing fish populations, giant rotating ocean gyres filled with consumer plastic, ocean acidification, blanching coral reefs, and melting glaciers. Our disappearing marine life is a classic illustration of the Tragedy of the Commons, and our inability to self-correct is sometimes explained as the natural outcome of an intractable Prisoner’s Dilemma. An illustration follows.

Dilemma

Suppose two rational neighbours live by a pond where a thriving population of 100 fish frolics, growing at a net rate of 10% per month. The two neighbours meet and agree to protect and maintain the pond’s stock of 100 fish, and to share the growth bounty equally, with each harvesting only 5 fish from the pond per month. The problem with the agreement is that it would in fact be rational for each neighbour to cheat. To see that, consider the possible actions of each neighbour: 1) if neighbour A decides to cheat (catches 6 fish instead of 5) and believes that B will not cheat (will catch only 5 fish), A gets 100% of the benefit of cheating, and the communal loss (i.e. the diminished fish stock, which could be countered by harvesting a bit less next month) is borne equally by A and B; similarly, 2) if A thinks B will in fact cheat (catch 6 fish), then it is again in A’s interest to cheat (catch 6 fish), because not cheating would mean that A would bear half the cost of B’s cheating, with none of the upfront benefits in extra fish.

That cost/benefit assessment is of course the same from B’s perspective. So rather than both neighbours not cheating, which would in fact be in both A’s and B’s best long-term interest (e.g. being able to harvest 5 fish per month forever), it is short-term rational for both neighbours to cheat, and that of course is how we end up with fish populations in free-fall in oceans across the globe.
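The payoff asymmetry described above can be sketched numerically. The stock size, growth rate and harvest quotas below come from the example; folding the shared communal loss into the same month is a simplifying assumption:

```python
def payoff(a_catch, b_catch, stock=100, growth=0.10):
    """One-month payoff for neighbour A: fish caught now, minus half of
    any depletion of the stock beyond its sustainable surplus (the
    communal loss is borne equally by A and B)."""
    surplus = stock * growth                          # 10 fish of sustainable growth
    depletion = max(0, a_catch + b_catch - surplus)   # fish taken beyond the surplus
    return a_catch - depletion / 2                    # A bears half the communal loss

# The four outcomes from A's perspective (5 = honour the quota, 6 = cheat)
for a in (5, 6):
    for b in (5, 6):
        print(f"A catches {a}, B catches {b}: A's payoff = {payoff(a, b):.1f}")
```

Whatever B does, A's payoff is higher when A cheats, which is exactly the dominant-strategy trap the text describes.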

But the environmental problem is a bit more complex than this simple example suggests: in addition to rapidly falling fish populations, what fish stock remains is becoming increasingly toxic as ocean pollution rises. Most ocean pollution comes from the land (e.g. fertilizers, pesticides, mine tailings, plastics); the table below summarizes statistics for the Pacific coastal regions, for illustration. In short, the Pacific accounts for roughly 50% of global ocean waters, and 99% of commercial fishing is done within 200 miles of coastlines and within 500 meters of the surface.

In Table 1, we calculate the area of the “donut” of Pacific Ocean coastal waters where most of the fishing is done. This area also corresponds to where most of the ocean trash is dumped annually, which enables us also to compute the concentration of trash in coastal ocean waters. Assuming that toxins account for 2% of trash, and that fish have a concentration of trash-emitted toxins 50 times higher than ambient waters, we find that on average, large fish accumulate toxic concentration levels in their flesh in the range of the EPA’s maximum recommended threshold within two years of life in Pacific coastal waters.

[Table 1: ocean pollution statistics for Pacific coastal waters]
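The kind of arithmetic behind Table 1 can be sketched as follows. The coastline length and annual trash tonnage below are placeholder assumptions, not the report’s actual inputs; only the 2% toxin share and the 50× bioconcentration factor come from the text:

```python
# Placeholder inputs -- ASSUMED figures, not those of Table 1
PACIFIC_COASTLINE_KM = 100_000      # assumed combined Pacific coastline length
BAND_WIDTH_KM = 200 * 1.852         # 200 nautical miles, in km (from the article)
DEPTH_M = 500                       # fishing depth from the article
ANNUAL_TRASH_TONNES = 8_000_000     # assumed trash entering these waters per year
TOXIN_SHARE = 0.02                  # toxins as 2% of trash (from the article)
BIOCONCENTRATION = 50               # fish tissue vs ambient water (from the article)

# Volume of the coastal "donut": coastline length x band width x fishing depth
volume_km3 = PACIFIC_COASTLINE_KM * BAND_WIDTH_KM * (DEPTH_M / 1000)
volume_litres = volume_km3 * 1e12   # 1 km^3 = 1e12 litres

# One year of toxins, fully mixed into that volume (a simplification)
ambient_toxin_mg_per_l = ANNUAL_TRASH_TONNES * TOXIN_SHARE * 1e9 / volume_litres
fish_tissue_mg_per_l = ambient_toxin_mg_per_l * BIOCONCENTRATION

print(f"ambient toxins: {ambient_toxin_mg_per_l:.2e} mg/L")
print(f"fish tissue after bioconcentration: {fish_tissue_mg_per_l:.2e} mg/L")
```

Substituting the report’s real coastline, tonnage and accumulation-over-time figures would recover Table 1’s numbers; the point here is only the structure of the calculation.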

How can this problem be addressed? Consider the parties that must reach a lasting agreement for this pollution and depletion problem to be reversed: it is no longer an agreement among fishermen to harvest responsibly, but also among nations to hold pollution in check. Our pond is now the Pacific Ocean, and our two neighbours are now China and the US. If one neighbour nation grows more rapidly, that nation will likely also increase its share of ocean pollution, impairing the resources and health of both neighbours, and both parties will have an incentive to harvest fish more quickly before the rapidly diminishing fish stock becomes even more toxic. So any agreement between neighbours must include both harvesting and pollution limitations, but as was the case in our simple pond example above, it may be short-term rational for either neighbour to cheat on any agreed quotas.

A closer look

To address this problem, let’s begin by modeling the dynamics between the human and fish populations. At its simplest, we have a rising human population leading to an increase in GDP, an associated rise in industrial pollution, and greater harvesting of fish for food. Rising pollution and a diminishing fish stock in turn cause a higher concentration of pollution in fish populations, resulting in greater food toxicity and deaths for humans. In sum, humans take fish out of the ocean while adding pollution. What outcome can we expect for human and fish populations over time? We have a number of modeling alternatives, including the Schaefer harvesting model and the Lotka-Volterra predator-prey model. In what follows, we will employ a modified set of Lotka-Volterra logistic equations and examine these dynamics over time.
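The author’s exact equations are not reproduced here, but a generic modified Lotka-Volterra system of the kind described, in which humans harvest fish and emit pollution that raises fish mortality, can be sketched with simple Euler integration (all coefficients are illustrative assumptions, not fitted values):

```python
def simulate(months=240, dt=1.0):
    """Euler integration of a modified Lotka-Volterra system: humans (H)
    harvest fish (F) and emit pollution (P) that raises fish mortality.
    Populations are normalised; coefficients are illustrative only."""
    H, F, P = 1.0, 1.0, 0.0           # normalised humans, fish, pollution stock
    r_f, K_f = 0.10, 1.0              # fish growth rate and carrying capacity
    harvest, r_h = 0.06, 0.02         # harvest pressure, human growth rate
    emit, decay, toxicity = 0.01, 0.005, 0.05
    history = []
    for _ in range(months):
        dF = r_f * F * (1 - F / K_f) - harvest * H * F - toxicity * P * F
        dH = r_h * H * F              # human growth supported by the fish stock
        dP = emit * H - decay * P     # pollution emitted by humans, slowly decaying
        F = max(F + dF * dt, 0.0)
        H = max(H + dH * dt, 0.0)
        P = max(P + dP * dt, 0.0)
        history.append((H, F, P))
    return history

path = simulate()
print(f"after 20 years: humans={path[-1][0]:.2f}, fish={path[-1][1]:.2f}")
```

With these coefficients the fish stock declines as human population and pollution rise, the qualitative outcome the paragraph anticipates; different coefficients would of course give different trajectories.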

[Table 2]

Future Proofing Trade Execution Systems

A GlobalTrading Roundtable Discussion.


Regulation, technology advances and behavioural changes mean that trade automation will increase, and systems and their providers need to be flexible, agree speakers at a GlobalTrading roundtable. But, the degree of standardisation and the merits of outsourcing are controversial.

The technology and business functions within buy- and sell-side firms can no longer be siloed; they are interdependent. A trading desk can only be successful if it incorporates and prioritises technology that, if applied properly, will ensure a firm’s survival, agreed speakers at a GlobalTrading thought-leadership roundtable sponsored by Itiviti and hosted by the LSE on 22 November 2017.

Regulatory headwinds are an important catalyst for technology improvements that are creating new data sets and the opportunities to use them. The proliferation and fragmentation of liquidity and how to use it is a key issue for dealing desks.

For instance, the restrictions on dark pools imposed by the Markets in Financial Instruments Directive (MiFID) II will propel the automation of block trading to find alternative sources of liquidity and firms will need a cost-effective way to do this – which means making technology core to their businesses. Without the adoption of technology in a central role, firms will struggle to compete.

Furthermore, there is a danger that firms might re-invent the wheel for different asset classes, so it makes sense to move towards multi-asset platforms and integrated operations. Avoiding duplication reduces costs.

But, too often in the past, technology has been applied like a sticking plaster, patching up gaps in systems and salving deficiencies. Instead, future-proofing technology needs a clear-sighted integration of the business strategy with the technology architecture. It might be expensive for smaller firms to implement, and consolidation within the industry might be an inevitable consequence. Clearly, those firms that have already invested in technology as a key component of their business strategies are at an advantage.

Most topically, if a firm implements MiFID II correctly, then it should be well-positioned to adjust to new technologies and future regulations. This means applying the requirements of the legislation diligently across asset classes to ensure accurate transaction cost analysis and best execution. As data is fundamental to achieving best execution, greater automation is inevitable in order to tap into new sources of information and optimise processing. The scalability of data collection is essential.

Some asset managers have already unequivocally embraced an automated trading process. This includes integrating platforms and aligning data processing facilities in a systematic way with the objective of enhancing access to liquidity and maximising operational efficiency. The result, within just a few years, will be almost no human involvement in the process which instead will be managed by artificial intelligence and machine learning, according to a speaker.

However, even if this were an established trend, inertia would likely prevent its unanimous adoption within the industry, noted another speaker.

Of course, there is a danger that the perspectives of business strategists and technical experts get lost in translation. Their disciplines have different languages, so perfect integration and mutual understanding is perhaps an ideal benchmark rather than an attainable target.

Standardisation and customisation
Some speakers argued that firms should build internally the systems that give them a competitive advantage, such as data analytics, trade algorithms and content distribution, but buy – or more accurately rent – systems for their more basic activities. Another way of expressing the model is “to customise round the edges but install a core provided by a vendor”. Essentially, the firm has a standardised workflow system surrounded by smart algorithms.

However, the distinction between what should be proprietary and what is generic is not always clear. For example, some firms, especially if they are large and complex, might build their order management system in-house, while others might conclude that it can be standardised. Vendors need to make various predictions about the direction of automation, and there can be logistical and cost problems in trying to align those bets.


A significant insight that arose from the discussion about the buy/build or hybrid models was recognition that the “buy” option is now more often a “rent” option. Firms increasingly hire systems rather than own them, which allows them to be more flexible to advances in technology and the imposition of future, unknown regulation. This characteristic is likely to predominate among the younger generation entering the financial industry, who are more comfortable with the concept of renting and sharing, but still causes nervousness among veteran industry participants uncomfortable without fixed, secure data centres and copyright-protected trading systems.

The Transparency Evolution

By Michael Beth, Vice President, Equity & Derivative Trading, WallachBeth Capital

Regulatory changes have created the need for increased execution transparency.

Execution transparency has been a common preoccupation for most trading desks in recent years. Whether it is in order handling, exchange fees or transaction cost analysis (TCA) there is always information perceived to be missing or unclear.

Technology is one answer to the lack of transparency between the sell-side, the buy-side and their clients. With the advent of the FIX protocol, exchanges have been able to send specific information on where and how an order was executed, while order audit trail system (OATS) reporting sends detailed information on how orders are handled once received. The sell-side, in particular, has spent a lot of resources and time working on this.
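As a concrete illustration of that first point, a FIX message is a flat sequence of tag=value pairs separated by the SOH (0x01) delimiter, with execution details such as last price (tag 31), last quantity (tag 32) and last market (tag 30) carried as ordinary fields. A minimal sketch of reading such a message (the execution report below is contrived, not a real fill):

```python
SOH = "\x01"  # FIX field delimiter

def parse_fix(raw):
    """Split a FIX message into a tag -> value dictionary."""
    fields = {}
    for part in raw.strip(SOH).split(SOH):
        tag, _, value = part.partition("=")
        fields[tag] = value
    return fields

# Contrived execution report: 35=8 (ExecutionReport), 31=LastPx, 32=LastQty, 30=LastMkt
msg = SOH.join([
    "8=FIX.4.2", "35=8", "37=ORD123", "55=0005.HK",
    "54=1", "31=78.55", "32=4000", "30=XHKG",
])
fills = parse_fix(msg)
print(f"filled {fills['32']} {fills['55']} at {fills['31']} on {fills['30']}")
```

Real FIX handling also involves body-length and checksum fields, repeating groups and session logic; this sketch shows only why the protocol makes per-execution venue and price data easy to disseminate.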

However, although these methods have proven useful, they are “old” technology, and new ways to disseminate data are being developed. The major drivers of these advancements have come from both regulation and commercial necessity. Of course, the most recent regulatory initiative to improve transparency is the Markets in Financial Instruments Directive (MiFID) II, with technical standards drafted by the European Securities and Markets Authority. MiFID II introduces guidelines on how firms receive pricing, report transactions, quantify best execution, and pay for research. Estimates for the overall cost of this initiative reach well into the billions of dollars. It has forced vendors to create new systems, as well as upgrade existing platforms, to enhance trade transparency.

In the United States, a similar trend is emerging with Consolidated Audit Trail (CAT) reporting, and the first group of exchange members are expected to start sending data from November 2018. The amount of investment in technology required for compliance with CAT is also likely to be similar to the level of resources allocated to meeting the requirements of MiFID II.  For the first time, both Financial Industry Regulatory Authority members and other market participants will be obliged to report audit trail information across multiple asset classes.

Both MiFID II and CAT have generated a dire need for smarter technology and more trade transparency.  Old T+1 TCA metrics and execution data are no longer as relevant to traders who are increasingly concerned about how they are performing, rather than how they performed. The market has moved to the idea of “T+Now”.

Real-time data
True innovators will be able to supply required data in real time, and consumers of this data will be able to adjust their trading immediately to meet their best execution requirements. Moreover, those who are most successful at creating new technologies will incorporate both artificial intelligence and machine learning. Deploying these techniques gives an application the capability to enhance itself over time. At WallachBeth, we have created a T+Now portal that not only gives users access to TCA in real time, but also incorporates machine learning techniques into a proprietary pre-trade model.
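A “T+Now” metric can be as simple as implementation shortfall against the arrival price, recomputed on every fill so the trader sees how the order is performing while it is still working. The sketch below is a generic illustration of that calculation, not WallachBeth’s portal logic:

```python
def slippage_bps(side, arrival_price, fills):
    """Implementation shortfall of the fills so far, in basis points versus
    the arrival price. `fills` is a list of (price, quantity) executions;
    a positive result means the order is underperforming its arrival price."""
    qty = sum(q for _, q in fills)
    if qty == 0:
        return 0.0
    avg_px = sum(p * q for p, q in fills) / qty
    sign = 1 if side == "buy" else -1   # paying up hurts a buy; selling down hurts a sell
    return sign * (avg_px - arrival_price) / arrival_price * 10_000

# A buy order working during the day: each new fill updates the live metric
print(slippage_bps("buy", 100.00, [(100.05, 300), (100.10, 200)]))
```

Streaming fills into a function like this, rather than waiting for a T+1 report, is the shift from “how did I perform” to “how am I performing” described above.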

Older systems will have trouble adjusting. Monolithic architectures and physical servers are not easy to maintain. That is why more technologists are offering cloud computing and microservices frameworks. Both of these allow creators of applications to be agile and better able to adjust their capacity, as well as to build and update individual pieces of code.

The future is bright for firms that take this evolution in their stride. The demand is here, and all that is needed is for the first few innovators to take a leap of faith.


Upgrading Post-Trade Operations

By David Pearson, Head of Post-Trade, Fidessa

Already well-established for cash equities, automated trade matching, reconciliation and confirmation systems are finally extending to other asset classes.

Post-trade operations, at both buy- and sell-side firms, have lacked the investment in technology that the front office has enjoyed. Instead, they have suffered from a piecemeal approach, characterised by a plethora of tactical solutions for different asset classes, especially in middle office trade confirmation functions.

There are two deleterious consequences: first, there is uncertainty at the close of the trading day about the status of transactions; second, there is a susceptibility to errors caused by the manual matching which still dominates post-trade operations for most asset classes.

These problems are exacerbated for exchange traded derivatives by the additional complexity of having three parties involved in many trades. The lack of electronic post-trade workflow between the buy-side and sell-side means that the allocation process for the broker is based on input from the trader receiving voice instructions, which is vulnerable to miscommunication.

Many buy-side firms rely entirely on manual processes to reconcile and match trade details with the account statements from their brokers. Errors are detected late, and sometimes not fixed until the following day. Unmatched trades also mean that the sell-side counterparty may have to margin an open position for its client, and the client cannot be sure that the risk hedge it wants is in place.

An alternative approach is to meet the problem head on and provide a post-trade functional workflow that brings all the parties together to agree the trade economics of each account allocation. However, it is expensive to introduce internally. A more viable option is to outsource the post-trade functions to a third party which benefits from economies of scale through its provision of the service to other firms.

Automated process
Fidessa offers the industry a seamless electronic and automated capability to communicate and match trade details in exchange traded derivatives, driven by an electronic stream of trade allocations direct from the buy-side.

Its Affirmation Management System (AMS), founded on industry collaboration, connects and assimilates workflows in a standardised way to support trade confirmation, reconciliation and matching.

Firms that adopt the AMS have surety of their trading positions on the actual trade date. This lowers the level of operational risk associated with holding unconfirmed positions, and also reduces the number of errors that can be introduced through manual processing. It is a business utility service-based model, which can be easily incorporated into buy- and sell-side firms’ trading infrastructure using the FIX protocol.

The AMS extends existing front office workflows into the post-trade operation, enabling buy-side firms to use the data captured at the point of order execution. The settlement phase is faster with seamless workflows integrated between front and back offices, and errors are immediately flagged in a configurable format. Moreover, the direct electronic workflow with each broker eliminates the need for a central matching facility.

It is already well-established for post-trade equities, and has recently been introduced for exchange traded derivatives. Eventually, the service will be applied to over-the-counter and repo transactions where, like exchange traded derivatives, there is little or no automated post-trade reconciliation.

Clients send allocation instructions to Fidessa, which matches them against records from their brokers, who are integrated into the service for real-time processing. Both counterparties have their trades confirmed and reconciled immediately.
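Conceptually, the matching step pairs each buy-side allocation with a broker record on the trade’s economic terms. The toy comparison below illustrates the idea; the field names and matching key are assumptions for illustration, not Fidessa’s AMS schema:

```python
def match_allocations(allocations, broker_trades):
    """Pair buy-side allocations with broker trade records on their economic
    terms; return (matched allocations, unmatched allocations, unclaimed
    broker trades). Each record is a dict of trade economics."""
    def key(t):
        # The trade economics both sides must agree on
        return (t["account"], t["contract"], t["side"], t["qty"], round(t["price"], 4))
    open_broker = {key(t): t for t in broker_trades}
    matched, unmatched = [], []
    for alloc in allocations:
        trade = open_broker.pop(key(alloc), None)
        (matched if trade else unmatched).append(alloc)
    return matched, unmatched, list(open_broker.values())

allocs = [{"account": "FUND-A", "contract": "FESX Sep18", "side": "B", "qty": 50, "price": 3401.5}]
broker = [{"account": "FUND-A", "contract": "FESX Sep18", "side": "B", "qty": 50, "price": 3401.5}]
matched, breaks, unclaimed = match_allocations(allocs, broker)
print(f"{len(matched)} matched, {len(breaks)} breaks")  # → 1 matched, 0 breaks
```

Flagging the `breaks` and `unclaimed` lists on trade date, rather than the following day, is the operational-risk benefit described above.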

In addition, the AMS is flexible enough to integrate with other third-party systems. The objective is to raise the technological standards of post-trade operations to those that have already been reached at other stages in the trade cycle. There is plenty of room for other systems, but it makes sense for them to be compatible to best serve the interests of clients.


The Case For Outsourcing

By Carl Slesser, Head of Sell-Side Product Management, Market Technology, Nasdaq

Brokers are turning to third parties for support in running their trading operations.

Regulations to promote competition in trading have led to a proliferation of liquidity pools over the last decade or so – many of which are run by large brokers and investment banks. But, these organisations have found that running a trading venue is getting more difficult.

Regulations are changing rapidly, and oversight has become much stricter. Firms need constantly to respond to new demands from regulatory bodies while differentiating to retain a competitive edge. With margins continually squeezed, many sell-side firms realise that their efforts could be better spent on enhancing core competencies aimed at generating revenue rather than on keeping the operations of these venues within regulatory parameters.

Complexity breeds an opportunity for outsourcing. To this end, the market technology division at Nasdaq, which has traditionally served the global market infrastructure operator community as a technology partner, has experienced a huge uptick in interest in outsourcing from sell-side firms. Specifically, they are considering having a trusted third party perform certain trading and operational functions and related services.

Some firms have ambitions to unify their multi-asset trading on a single platform and have us operate it for them, while others want us to help them meet their Markets in Financial Instruments Directive (MiFID) II obligations as a newly declared systematic internaliser (SI) or multilateral trading facility (MTF). In one case in the US, the firm wants us to take on the complete technology and operations of its entire dark pool. As the market structure changes, we see a wider array of use cases for outsourcing, but all with the same underlying need: to re-focus scarce resources on revenue-generating projects and focus on differentiation.

Outsourcing is not new in our space. Not surprisingly, the key drivers are regulatory complexity and cost with the demand for resources, technology evolution and data complexity woven in.

Regulatory complexity
The stiff fines and penalties regulators have levied against brokers for rule violations on their trading venues have been a wakeup call for the industry. In 2016, a major global bank was fined for misleading customers about its equity dark pool platform pertaining to an issue that involved a failure to address known technical problems with its proprietary dark pool ranking model. With various global regulatory initiatives around the corner, these types of sanctions will become more common.

MiFID II provides one example of where regulatory compliance will get even tougher, as the mandate ushers in a new market structure regime comprising regulated exchanges, MTFs, organised trading facilities (OTFs) and SIs. Ultimately, MiFID II promotes greater transparency and discourages trading in the dark. Broker crossing networks will be eliminated, and firms that internalise trades will have new obligations. As a result, brokers are currently reassessing the way they will service their clients, as well as the processes, procedures and technology prowess required to meet these obligations.

Because the regulation applies to far more asset classes than its predecessor, MiFID I, firms need to take a hard look at a more holistic approach to their technology and operations. Furthermore, the transaction reporting requirements on these firms are onerous, and pre-trade transparency and surveillance monitoring requirements will undoubtedly bring more technological and resource challenges to organisations that claim new status as either an SI, MTF or OTF.

Cost pressures
Overall, the regulatory compliance burden has increased – whether in Europe in relation to MiFID II, or other regions where regulators are cracking down on dark pools and other internalised order flow. This has put enormous pressure on brokers’ balance sheets, and they need to search for areas where costs can be reduced to achieve capital efficiency. These new demands cannot be executed on aging technology. Firms must either spend the money to refresh it internally, or they must outsource and partner with a trusted third party that can deliver and potentially even run the solution.

When comparing the costs of keeping pace with functional developments, client demands and required regulatory changes, outsourcing may provide an attractive alternative. Working with a technology partner that understands the operational needs of a marketplace can enable brokers and banks to concentrate resources on developing and monetizing their intellectual property, such as their suite of algorithms, instead of developing and operating trading solutions.

Building a case
Outsourcing can be an attractive alternative for firms that are looking for creative ways to better leverage scarce cross-departmental resources and to boost revenues in a complicated, extremely competitive landscape. Outsourcing mission-critical operations to a vendor can be daunting and discomforting, so the choice of partner(s) is critical. Sell-side firms looking to embark on this journey must consider the expertise of their partner, including its technology prowess and its ability to reliably execute on the firm’s behalf, not just tomorrow, but far into the future.


Adaptation to the General Data Protection Regulation

This report describes the approach that Chief Data Officers within investment banks could take in managing their organisation’s implementation of, and compliance with, the EU’s General Data Protection Regulation.


Bringing Finance Back To The Forefront Of Innovation

By Kenneth McLeish, Global Head of Equities Trading Technology, J.P. Morgan Asset Management

Financial firms that put technology at the core of their business, use agile development processes and leverage the cloud will gain a competitive edge.

“I hope our feathered messengers will have brought you in due time our good prices…A B in our pigeon dispatches means: buy stock, the news is good; C D…means sell stock, the news is bad”, Nathaniel de Rothschild, Paris, August 1846

Finance firms have long pioneered new technology to give them an edge: from Baron Rothschild’s innovative use of carrier pigeons in the 1800s to improve the timeliness of information coming from the Battle of Waterloo, to hedge fund managers of the 2000s who shaved millionths of a second off message times by sending them over microwaves rather than via optic fibres.

During the 1990s and 2000s, technology investment in finance soared and technology talent flocked to join finance firms. Finance was at the forefront of technological innovation and home to some of the biggest data centres, most sophisticated networks and most complex software ecosystems.

Yet, during the past decade, newer firms such as Google, Facebook and Netflix appear to have overtaken investment firms as the ones pushing the boundaries of innovation. This is a trend that delegates at last December’s Equities Leaders Summit in Miami recognized.

How did finance fall behind and what can be done to bring finance back to the forefront of innovation?

Technical debt
Technical debt is a term frequently used by technologists. Simply put, it is avoidable complexity which, if allowed to build up, eventually slows things down and becomes burdensome. Many finance firms were good at building new systems but less diligent about retiring dated ones. The result has been large, complex ecosystems of tens, or even hundreds, of applications that are expensive to maintain or change, and that have slowed a number of finance firms considerably.

Regulation
Increased regulatory reporting, cybersecurity and resiliency requirements have had a profound impact. Building more resilient systems has tended to require more than one data centre and a doubling of equipment (which tends to be of higher quality, to reduce the risk of component failure), while keeping everything in sync. With the ever-growing cybersecurity threat, maintaining a bigger estate has become an increasingly expensive exercise. Multiply this by a large number of systems and one can begin to see what has been consuming the technology budgets of many finance firms.

Cultural change
The increasing cost of delivering technology coincided with the financial crisis of 2007-2008. Technology budgets came under pressure, with many firms struggling to do much beyond meeting regulatory requirements. As a result, many firms looked to reduce technology costs by outsourcing or offshoring which, over time, has become harder to manage from a distance. To compensate, management processes have become more rigorous with distinct phases of analysis, build, test, release and support, often done by separate teams in separate locations.

What can be done to increase innovation?
Innovative companies tend to operate notably differently from more traditional ones: how things are developed is as important as what is developed. Upon closer inspection, we’d argue that innovative firms tend to have three themes in common: they see technology as core to their business, they use agile development processes and they use the cloud.

Core
Within finance, business and technology were, for a very long time, typically viewed as distinct functions. They sat in different locations, had different reporting lines and were often poorly represented on operating committees and in boardrooms.  Limited comprehension of emerging issues and misaligned goals (typical by-products of such segregation) would often drive both functions even further apart.

But, as was later learned, when technology is integrated into the business, efficiency gains arise and the environment becomes more conducive to innovation. Having technologists sit alongside their business users can, for example, lead to these efficiency gains:

If a developer is sitting beside a business user, the feedback loop tends to become instantaneous. In the event that a business user encounters problems with a particular application, having technologists on standby to ascertain and fix the issue quickly enhances the overall ecosystem.

Technologists are widely regarded as natural problem solvers. Being exposed to daily business activities, they often build a deep understanding of the business which, in turn, increases their overall effectiveness and leads to new and innovative solutions.

Agile
Traditional software development would often be run like a construction project. Phases were divided into requirement gathering, analysis, design, build, test and release.  The process was lengthy and often led to results that didn’t necessarily match expectations. But iterative and incremental development processes have grown in popularity, especially since being formalized as “agile” in the early 2000s.

Delivering a little and often enables teams to get feedback, learn and amend along the way.  With small steps and multi-skilled technologists working in close proximity to the business, layers of business analysts and project managers have often been replaced. Coupled with automated testing and deployment processes, the need for manual testers and release managers has also reduced.

A modern development approach results in a higher proportion of technologists producing code. In traditional finance firms, one might expect to see less than half of a technology team writing code; in an agile team, more than 80% can code, roughly a doubling of producers. With the efficiency gains of automation and continual feedback, the time spent producing also increases.

By way of comparison, Amazon releases new code to production every few seconds.  While this degree and pace of change may feel uncomfortable in the highly regulated world of finance, it is possible and can potentially be advantageous. In reality, a periodic release bundles a number of changes together, which increases complexity and the risk of something going wrong.

By investing in automated testing, systematic release processes and a flexible architecture, releasing becomes easier and more predictable.  When releases are easier and more predictable, changes can be introduced incrementally.  The risk of each release is reduced, impact is minimized and diagnosing the issue becomes easier.  Teams then spend less time resolving problems and more time adding value. The pace of change and ability to innovate is accelerated.
