
Digital Transformation of Investment Banking: The Capital Markets’ Industrial Revolution

GreySpark Partners presents a report, Digital Transformation of Investment Banking, exploring the emerging industrial revolution of wholesale banking operations’ business models. It explores the influence of regulations, buyside institutions, the competitive landscape and technology innovation in driving this revolution, which will see investment banks transform into lean, efficient and reliable manufacturers and service providers.

https://research.greyspark.com/2016/digital-transformation-of-investment-banking/

MiFID II – What The Industry Can Expect Over The Next 600 Days

By Dr. Sandra Bramhoff, Senior Vice President, Cash Market Development, Deutsche Börse AG
If you google ‘MiFID II’, more than half a million results are returned. The amount of information available seems vast, but the industry still awaits the publication of some of the most important details: the final Level II measures. These include 28 RTS (Regulatory Technical Standards), 8 ITS (Implementing Technical Standards) as well as Delegated and Implementing Acts. The industry is under particular pressure regarding the modifications needed for IT systems in order to comply with the forthcoming requirements. By proposing to delay the application of MiFID II by a year, European legislators have acknowledged that significant system changes need further time. Brussels is feeling the pressure: National Competent Authorities (NCAs) need nine months to transpose the requirements into national law.
Therefore, Member States and the European Parliament are considering postponing the date for transposition into national law by one year as well, that is, to 3 July 2017. The process also envisages a period of six months between the transposition into national law and applicability itself. To keep the envisaged timeline, all Level II measures need to be published before the end of September 2016 to give market participants as well as authorities sufficient time to adapt to the new technical and legal requirements. However, one would be mistaken in believing that once the Level II measures are released the industry will have full clarity. The “details of the details”, which are the Level III measures in the form of ESMA guidelines and Q&As, are yet to be drafted and will be important to fill the remaining gaps and to contribute to a shared understanding among NCAs of how to interpret and apply the new provisions. Nevertheless, it is unlikely that the final Level III measures will be published before the end of this year.
MiFID II impact
With the introduction of a trading obligation for shares, the double volume cap mechanism and thresholds for Systematic Internalisers (SIs), OTC trading in shares will almost vanish. OTC trading will only be allowed if it is non-systematic, ad-hoc, irregular and infrequent. So where will those that traded OTC trade in future? Will the lit markets (Regulated Markets and MTFs) benefit, or will the Systematic Internaliser (SI) regime experience a revival? While Regulated Markets and MTFs will have to follow a new tick size regime (which overall implies very low spread-to-tick ratios), SIs are exempt. Hence an SI can execute all orders of a specific client at any price better than its publicly disclosed quotes. If matched principal trading were allowed under the roof of the SI (clarification on this is expected with the forthcoming Delegated Acts), current broker crossing networks would be able to continue (if required) to interpose between buyer and seller in such a way that they are never exposed to market risk.
Furthermore, although the quote display requirements will increase, there will only be a small quote size, i.e. for a very liquid share it would only be 10% of standard market size, which is easy to fulfil. While the industry still awaits clarity with regards to the final SI thresholds, it remains unclear how the internal matching platform fits into the world of MiFID II. The Level I text states that investment firms that operate an internal matching platform need to be authorised as an MTF.
So if the SI thresholds are not met, broker crossing networks will be able to continue trading OTC as long as their trading is non-systematic, ad-hoc, irregular and infrequent and they are not running an internal matching platform. Whether the Delegated Acts will provide more clarity on what an internal matching system actually is remains open as of today.
While in future investment firms will have to trade shares on Regulated Markets, MTFs or SIs, the trading obligation does not apply to fixed income. However, under MiFID II an additional trading category, the Organised Trading Facility (OTF), will be available. The OTF is meant to capture all types of organised execution not currently provided by existing venues. Both organisational and transparency requirements will apply to ensure efficient price discovery. In comparison to Regulated Markets and MTFs, the OTF category provides discretion over how orders are placed and executed. While it also allows, to some extent, dealing on own account (other than matched principal), it is strictly forbidden for the same legal entity that operates an OTF to operate an SI. As with equities, investment firms that deal on own account by executing client orders outside a trading venue have to register as an SI. With regards to the new bond transparency rules, the industry might actually see a phase-in approach for the disclosure requirements of liquid, non-liquid and size-specific (SSTI) instruments that were originally proposed by ESMA. ESMA now has until mid-May to revise the draft standards that were published at the end of September 2015.
The rules on open access might also change the current post-trading landscape. Trading venues will in future be required to provide access to those CCPs that wish to clear transactions on those trading venues. Liquidity might be directed to the trading venue with the most attractive Post-Trade offering. Competition will increase on all layers of the value chain, for trading, clearing and settlement.
The introduction of market maker agreements and market maker schemes will considerably change the concept of liquidity provision. While today liquidity provision takes place on a voluntary basis, investment firms that are engaged in algorithmic trading and pursue a market making strategy will in future be required to enter into an agreement and to continue making markets during stressed market conditions. Venues will be obliged to provide incentives during those stressed market conditions. Only in exceptional circumstances, such as extreme volatility, will market makers not be obliged to provide liquidity. Today, trading venues generally stop the performance measurement during such times.
Although most venues in Europe already apply order-to-trade ratios (OTRs), the introduction of not only a number-based but also a volume-based order-to-trade ratio will be new to most. Trading firms will be required to carefully adjust their strategies if they want to avoid breaching them, as a single breach per day would mean a violation of the OTR regime.
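To make the two measures concrete, here is a minimal sketch of how a number-based and a volume-based OTR might be computed for one participant's trading day. The limit value and the activity figures are hypothetical, not any venue's actual parameters.

```python
def otr_by_count(num_order_events: int, num_trades: int) -> float:
    """Number-based OTR: order events (entries, modifies, cancels) per trade."""
    return num_order_events / max(num_trades, 1)

def otr_by_volume(order_volume: float, traded_volume: float) -> float:
    """Volume-based OTR: quantity entered or modified per quantity executed."""
    return order_volume / max(traded_volume, 1.0)

# One day's activity for a single participant (hypothetical figures).
order_events = [(100, "new"), (100, "cancel"), (250, "new"), (250, "modify")]
num_trades, executed_qty = 1, 120

ratios = {
    "number": otr_by_count(len(order_events), num_trades),                   # 4.0
    "volume": otr_by_volume(sum(q for q, _ in order_events), executed_qty),  # ~5.8
}

MAX_OTR = 4.0  # hypothetical venue limit; a single daily breach is a violation
for name, ratio in ratios.items():
    if ratio > MAX_OTR:
        print(f"{name}-based OTR of {ratio:.2f} breaches the limit of {MAX_OTR}")
```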
Investment firms as well as trading venues will be required to maintain extensive records of data; especially those that operate a high-frequency algorithmic trading technique will be impacted. The industry needs to prepare to store terabytes of data!
The requirements to report transactions to the competent authority will be widened by including those financial instruments that are admitted to trading or traded on a trading venue (i.e. Regulated Markets, MTFs, and OTFs) or where the underlying is a financial instrument, basket or index traded on a trading venue. Investment firms will also be obliged to include a wider range of data fields in those reports. Competition will increase with regards to service offerings of reporting solutions through Approved Reporting Mechanisms (ARM).
The list of additional requirements, such as synchronisation of business clocks, license requirements for high frequency trading firms, revised best execution rules, just to mention a few more, is long. Some of these have higher impact than others but eventually will require a considerable time for implementation.
In a nutshell
With more than 1,000 pages of new and revised rules, the industry has a lot to absorb over the next 600 days: the impact on market structure is considerable. Although the draft RTS and ITS are available and the Delegated Acts are expected to be released in a step-wise approach this quarter, the industry needs clarity in terms of final rules. The details of the details are not expected before the end of this year, and the implementation of MiFID II into national law should not be forgotten either.

Driving Latency Monitoring Across The Marketplace

With David Snowdon, Founder, CTO, Metamako
A hundred microseconds has become the target accuracy for trading systems’ trading reports – after much debate that is what the regulators have settled on. However, modern markets operate at a much faster pace, and for the regulators to be able to understand the ordering of events, those timestamps must be much more accurate. They want to be able to reconstruct what happened in the market with confidence, but the current requirement of 100 microseconds is just too coarse to be able to do so.
Without becoming too technical, a system using 10G Ethernet – the predominant standard for communication in modern markets – gets 64 bits of data every 6.4 nanoseconds – 6.4 billionths of a second. It’s a fundamental part of the way that communication occurs: every 6.4 ns a block of data is transferred – part of a message. Within that 6.4 ns, the timing is much less significant. Effectively, two packets which begin to be transmitted during the same 6.4 ns period will appear as simultaneous to the system which is receiving them. It’s a position which is arguable from the fundamentals of the way in which computers communicate.
That’s about 10,000 times more accurate than MiFID II’s hundred microsecond requirement. In a hundred microseconds you can easily get an entire order to the exchange, and on some exchanges it’s possible to execute an entire order, get the response back, and place a second order and both orders would be considered to be simultaneous within the regulations!
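The arithmetic behind those two figures is easy to check with a quick sketch:

```python
# 10G Ethernet delivers one 64-bit block per cycle of its 64-bit datapath.
line_rate = 10e9                 # bits per second
block = 64                       # bits per transfer
slot = block / line_rate         # seconds per block
print(slot)                      # 6.4e-09, i.e. 6.4 nanoseconds

mifid_granularity = 100e-6       # MiFID II's 100 microsecond requirement
print(mifid_granularity / slot)  # 15625, roughly four orders of magnitude
```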
While the present requirements are not sufficient for the stated purpose, the required accuracy can be adjusted. In some sense, this is only the beginning…
As time goes on and firms become more comfortable with the technology, the nirvana of traceability can be achieved. While this won’t achieve the goal in the near term, this is a huge step forward.
Competing with HFT
There’s been a lot of debate surrounding market makers and high frequency trading generally. To compete in the modern world firms need to have good information and analytics to base their optimisation decisions on. Can you imagine a floor trader relaxing with a coffee during a market crash? It might be comfortable, but it’s hardly the way to succeed. A firm without network visibility is a firm which can’t understand its own behaviour, since the boundary of the network is the definitive point at which the trade can no longer be affected.
Firms have a responsibility to their customers to get it right, and the requirement of accurate time reporting is a chance for these firms to have a solid understanding of what works and what doesn’t.
Behaviour versus cost
The technology is not the issue for the majority of firms. If a house is looking to be absolutely at the cutting edge and they’re only relying on latency to get an advantage, then, yes, they need to put a lot of money into the technology. But on a basic level, the technology that’s required to be competitive is not expensive.
The HFTs are the disruptors of the trading world. They are the Ubers, Googles and Facebooks. They are using technology to do things better than the slow firms. Their company culture is agile and they are technically aggressive. Their employees are highly motivated to perform and more… they’re willing to take technical risks.
Driving the change
The current regulatory drive is bringing significant attention to the issue of accuracy of network monitoring and time stamping, and if that drives firms down to the microsecond or nanosecond accuracy level, which is achievable without a massive investment, then it’s a very positive step. Forward-looking firms are looking at MiFID II not as a burden to meet the letter of the regulation, but as an opportunity to push the technology internally and get a number of other benefits from the data. These firms are going much further than what’s actually required to meet the specification.
More than that, firms are not just looking at Europe. They’re looking worldwide. Again, this is an opportunity to understand what’s going on and apply that knowledge to improve the process.
Future latency
The speed of light is constant, given distance and medium. The debate though is who hits the end of that fibre first, which is where you get to start trading. You can have a substantial response time from the exchange but still care about how consistent your own internal switches and trading systems are.
Once firms get their heads around the problem of basic latency monitoring and consistency, they will start to push further than the regulation requires. Without measurement, it’s impossible to implement effective improvements.
The regulators have done well to bootstrap the process, but if we want to understand who did what, when, and in what order, then this is just the beginning. We’ve got four orders of magnitude to go.

Unifying Communications

With Michael Speranza, Senior Vice President Corporate Strategy and Marketing, IPC
Over the last two years, firms in the US have seen an expansion in the number of users that are falling under the umbrella of regulation, where users that previously didn’t need to be monitored, regulated, controlled, or secured are now being cast into the same oversight as a typical front office trader in jurisdictions like Europe and Asia.
IPC’s focus is making sure that we address the entire ecosystem of regulated users as it relates to areas such as compliance, communications, securing those communications, allowing them to exchange information with their counterparties securely and then mitigating their risk. A lot of the transition that we’ve gone through as an organisation is to make sure that we continue to evolve to meet those needs of the clients.
Communications is one of the more challenging elements to solve from a regulatory perspective because a lot of solutions currently deployed were built to address dispute resolution rather than regulatory adherence. Many current solutions and technologies are not sufficiently integrated with the communications infrastructure and struggle to cater to the day-two needs of correlating interactions with transactions or analysing the content for surveillance purposes.
The ongoing trend is therefore to try and harmonise the solutions as much as possible, because it makes complying with inquiries and regulators, securing the business and matching communications with actual transactions much easier. We’re working to unify communications associated with capital markets activity as much as possible and provide end-to-end oversight, so a firm can work with a single provider that can address the whole sphere of challenges that they’re facing from a compliance and a regulatory standpoint.
A major effort that is ongoing is to unify the technology stack and consolidate the different media types. With communications varying in form – email, instant messaging, voice, chat rooms – many customers are trying to unify them as much as possible to aid the audit control and inquiries that are required to comply with the regulators.
Making money from data
Looking at how data compliance use has evolved over the last few years: when firms had to comply in the US, for example following Dodd-Frank, they simply adopted what technology was available. There was an extent to which compliance came first, and time was a principal pressure. They were not stepping back, surveying the landscape of technology and designing a solution from the ground up. A lot of those solutions are challenging, cumbersome and have high operating costs when it comes to the overall use, maintenance and ensuring that they stay operable.
Now, firms are doing two things. One, firms are increasingly looking at this area and evaluating how they can make the technology not just a liability but a competitive advantage; having a more efficient operating model to comply with the regulations, and then using that to boost return on equity and lower their cost base.
Secondly, firms now have a vast amount of data, and they are starting to look at data as an asset instead of a liability, and they’re trying to interrogate, mine and draw useful conclusions from the data that was previously unused. Some firms are starting to study successful interactions, client communications, understanding what leads to a successful transaction or a profitable one and how to replicate that across certain environments. A lot of that information today is locked up and captured in unstructured formats that were designed for compliance and regulatory purposes. The new challenge is to gain access to that dataset, to use it, study it and actually turn it into a source of productivity.
The future of data
First and foremost, we’re focused on holistic end-to-end solutions to make sure that we streamline the challenge of regulatory oversight as much as possible. Intellectual capital is definitely a challenge in this area in terms of actually locating, finding, securing, recruiting and retaining staff that understand the specific needs of the market.
There are some fascinating developments occurring with regards to improving analytics capabilities on the data sets; this is very much related to enhancing user productivity or improving surveillance. As firms try to mine the data, areas like the fidelity of the conversation are important. If things aren’t clear, if they’re not archived correctly, studying them for a productivity purpose or for a surveillance purpose becomes very challenging.
We obviously study the regulations as much as we can. They’re incredibly voluminous documents, but we also have the benefit of working with all of our customers. It’s something that allows us to be at the forefront of solving their challenges, and they’re not shy when it comes to telling us what they would like us to do with the products and services. We share the front row seat with our clients and we very much try and partner with them to make sure that we’re investing in the right areas; it’s a collaborative approach.
The concepts we’re working on – security, policy, controlling communications, the fidelity of the audio – our clients have the pen on our roadmap when it comes to those things. We want to build things that provide value to customers and we definitely have a multi-year horizon to be able to do exactly that.

The Shifting Sell-Side To Buy-Side Responsibility

With Neil Bond, Head Trader, Partner, Ardevora Asset Management
Of the 25 years I spent working on the sell-side, most of it was spent on program trading and algos, which are quite intimately linked. Algos were originally developed for program trading desks but the buy-side were given access to them very soon after.
Why make the move to the buy-side? After all those years on the sell-side, I wanted a change that would be suitably challenging but where I wasn’t a complete beginner. I had been speaking to the buy-side for a very long time so I wanted to have a shot at what they were doing.
Rare switch
Despite many aspects of the role being very similar, it is quite a big transition to move from the sell-side to the buy-side. If a trader is going the other way – from the buy-side to the sell-side – they are doing so as a valued customer. Going from the sell-side to the money management side is trickier; while you may well have helped those people to trade effectively in the past, the move seems to be more difficult to negotiate.
One of the issues I faced when trying to make the switch was that investment firms wanted someone who already had buy-side experience. This is becoming less of a hurdle as more sell-side traders make the switch. When I did make the jump I could see how many more moving parts there were that brokers don’t get to see, such as compliance checks, fair treatment of accounts, management of prime broking relationships and areas like cash and exposure management, which I had to learn quite fast.
Moving to the buy-side allowed me to focus my aims and align them with the rest of the team around me, free from the conflicts of interest that arise from the silos commonly found on the sell-side (where a sales trader can be stuck between a client that pays commissions and a trader working for the bank that pays his wages). In the current principles-based regulatory environment it is good to be part of a genuinely cohesive team where the only aim is to provide the best return for our investors.
As a broker, if I didn’t find the stock, that tended to be the end of the story; you go back to the client and tell them that you can’t find the stock. If this happened on the buy-side however, there would be ramifications. It might be necessary to buy or sell something else instead which could affect your other positions – so it is far more involved than I had anticipated. Getting brokers on-board is also more difficult than I expected and it is necessary to justify reasons for building new relationships with brokers.
In addition, I had no experience of the relationship between traders and fund managers, and I needed to understand what they want from a trading desk and the best way to deliver what they need. Open communication between the two teams is crucial for success.
There are many advantages of having experienced the sell-side for so long, particularly understanding what happens to orders when they go across to the sell-side. I have a detailed knowledge of how the sell-side operates, what their options are and how difficult it is for them to operate in particular circumstances, and these are all aspects which can be shared with my firm and which ultimately give us more control.
Ardevora’s trading turnover was fairly small when I arrived, but AUM has grown fast and our blocks and programs are now comparable to much larger firms. In order to move away from simply sending orders out to brokers, Ardevora started to build up a wider range of broker relationships and increased use of electronic trading. The company began using dark pools to find liquidity and accessing the expertise of firms like Liquidnet and other dark pools to stay under the radar and have a more passive trading approach. We implemented a more analytic decision process when using risk programs, understanding when to avoid risk, measuring trades and understanding whether the choices made were the right ones. My sell-side skills have helped our trading desk become more scalable to meet the requirements of a higher AUM.
Exchange relationships
Twenty years ago there were no algos, no dark pools and no MTFs. There were no high frequency traders or electronic liquidity providers, and no one was directly competing with the exchanges. With MiFID I the trading environment changed dramatically, but the exchanges have been around for a long time and they are good at defending their positions. With the technological revolution in the market, the exchanges are developing fast to stay ahead of the technological curve.
An exchange’s primary role is to bring entrepreneurial companies onto the market, to facilitate the provision of capital to those firms. After that, their role is to provide a stable marketplace within which to trade those equities. As long as they keep on doing those two things without abusing their monopolistic position, they’re fine. However, if they get too greedy, market forces and regulations will intervene to increase competition.
One unintended consequence of the regulation is that a very fragmented market has developed, but the exchanges are responding with developments such as Turquoise BDS, midday auctions and hidden pegged order types to draw liquidity back to the primary exchange.
Transparency
The proliferation of low touch offerings has resulted in cost savings; however, the buy-side having a higher level of control means an increased level of responsibility. The market has become a great deal more sophisticated, insofar as the end users of the algos need to understand what is happening with their order once it goes into an algorithm, the venues that it is going to and all the different order types that those algos could be using. There is a vast amount of execution data generated that needs to be gathered, analysed and monitored.
Buy-side traders should have a high level of knowledge about what is happening to their orders. The regulators asked the industry who is responsible for best execution and, unsurprisingly, the buy-side said the sell-side did and vice versa! So the regulator has decided that both buy and sell-sides need to take increased responsibility for best execution and trade management. As a result, the buy-side is starting to adopt many of the sell-side’s tools and so ultimately each side should be held to the same level of accountability as the other.
One key difference between buy and sell-side is the access to liquidity venues available to each side. The buy-side can access many more liquidity pools than one broker can alone, because not all brokers share their pools. On the flip side, the sell-side has access to a great many buy-side firms for sales trading. They have access to what my competitors might be doing. As a result of that ‘information asymmetry’ there will always be a need for brokers.
The future for the buy-side
In the future, being able to demonstrate best execution – not simply having a best execution policy – will weigh heavily on both sides. The market will continue to evolve rapidly, and there is now a two-way pull due to the new venues, new order types and new systems that are being created very quickly as solutions to the problems the industry is facing, but it takes time to get these new projects to critical mass, as significant contributions in terms of resources are needed from within the industry.
One curious by-product of the current evolution is that while the buy-side has to go through the sell-side to access exchanges, their direct relationship with exchanges is growing stronger. It will be interesting to see how these direct relationships develop especially if investment banks are excluded from collecting research payments from trading commission.
MiFID II is going to require that firms demonstrate best execution, which means they not only have to do the right job, they also have to gather all the data and analyse it, which is neither cheap nor quick. That requirement will impact far more on the smaller firms than it will on the larger firms and raises the barrier to entry quite high.

Getting a Good Trade: Quality Assurance Testing For Brokers In The Modern Regulatory Environment

By Daniel Pense, a 20-year industry veteran of electronic trading technology, who has held various roles at vendor and broker-dealer firms including Credit Suisse, UBS, Donaldson, Lufkin & Jenrette, and Morgan Stanley.
Suppose that the marketing pitch from your trading desk to its institutional customers is successful, and they agree to try out your new algorithmic offering and make it part of their trading strategy in the next quarter. Now that you have the commitment for the trade, it is the job of the bank’s IT department to make sure that once the order gets in the door, it passes cleanly through the chain of systems that will place it at market, successfully manages the life cycle of the order, and returns every execution intact with the detail requisite to satisfy the client. With the increasing complexity and nuance of how markets allow orders to approach liquidity, and the lengthening chain of systems that enable them to engage in this dance, successfully completing a trade has become more complex than ever before. When the scrutiny and governance detail of increased regulation comes to bear, the requirement for precision in managing that complexity drives the difficulty to a new level. It has been said that mastery of technology is the difference between success and failure for a trading firm, but it is not mere skill at the trading process, but also the management of technical complexity that comes into play. All the good ideas and implementation for the latest algorithmic product offering may get front page press, but change management, design that accommodates instrumentation and tooling, and yes, plain old-fashioned good QA testing make all the difference in how a firm will do over time.
Customisability
Broker dealers deal with a set of problems specific to their niche. Generally it is expected that they will provide accommodative technical solutions for their customers that allow the Institutional side to interact with their set of servicers as uniformly as possible. Of course each institution will have requirements on all servicers that vary from institution to institution. Markets also may impose some of these same requirements for uniformity, but of course each of them also has their own flavours of trading requirements and offerings. This means that broker dealers must build systems that have a high degree of flexibility and customisability at least at the boundaries, and they need to be able to effectively manage the configuration complexity that comes with that flexibility. This typically requires customised tooling.
As systems evolve over time, there needs to be an effective way to test not only the base scenarios, but also how the systems perform with various customisations applied. In addition, the test process must include the instrumentation and tooling to assure that these remain functional with the deployment of each new build. When this requirement is multiplied across the various systems that are used by each trading desk the task of just maintaining system integrity is very daunting.
Put another way, good QA testing should always include both functional and non-functional specifications, and when the latter is neglected in favour of the former, plant manageability is compromised. When an outage occurs it will find the weakest link, and from a business relationship perspective, it may not matter if the problem is with TCA of the algorithm or simply the amount of time it takes to get back an execution from a broken trade. Anything that hurts the customer’s trading experience has the potential to impact their cost and can impact the relationship. Technology is always perceived to be only as good as the relationship that it facilitates. This is as much dependent upon consistency and manageability as it is on the latest trading fad.
Firms that consciously consolidate portions of their technology do better at managing the consistency of their delivery, and simplify the task of the quality assurance process. By centralizing certain aspects of the technology: having common data stores across desks or even asset classes, maintaining a tight internal data model (even if this is mostly centred on a common internal FIX spec), enforcing paradigms for resiliency and redundancy across the plant, firms can re-use the same QA techniques and even components.
Changing regulatory regime
The ongoing intensification of the regulatory regime poses additional challenges for quality assurance. In many cases these regulations do not mandate specific technical implementations, as the systems that manage trading are too diverse for this level of detail in the regulation itself. Broker dealers are left to interpret the regulation and translate it into specific technical implementations in their systems. Where there is a level of interpretation, a conservative approach is usually taken, because no one wants to develop a paradigm that will later lead to a failure in an audit by regulators. But once the specific technical methodology of addressing the regulation is decided, the QA regime must expand to include use cases that will exercise those specific controls.
A case in point here would be the Market Access rule, instituted a few years back, where the specifics of the risk control procedures are left to the individual bank. Each must determine how to create controls around and within their systems to meet the requirement.
The technical implementations that I have seen addressing this are varied, but all involve counterparty limit management and some supervisory override ability. In terms of QA testing the approach here would be to add cases to the functional testing which would breach limits and trigger supervisory review.
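As an illustration, a QA case of that shape might look like the following minimal sketch. The risk gate, its limits and the supervisory review queue are hypothetical stand-ins, not any bank's actual controls.

```python
class RiskGate:
    """Toy pre-trade risk control with per-counterparty notional limits."""

    def __init__(self, limits):
        self.limits = limits            # counterparty -> max notional
        self.used = {}                  # notional consumed so far today
        self.pending_review = []        # orders held for supervisory override

    def accept(self, counterparty, notional):
        used = self.used.get(counterparty, 0.0) + notional
        if used > self.limits.get(counterparty, 0.0):
            self.pending_review.append((counterparty, notional))
            return False                # rejected pending supervisory review
        self.used[counterparty] = used
        return True

def test_limit_breach_triggers_supervisory_review():
    gate = RiskGate({"CPTY1": 1_000_000})
    assert gate.accept("CPTY1", 600_000)          # within limit
    assert not gate.accept("CPTY1", 600_000)      # cumulative breach
    assert gate.pending_review == [("CPTY1", 600_000)]
```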
In some cases, while the regulation is more specific about the trading behaviour that it enforces, it leaves the implementation details to the broker dealer. For example, Limit Up/Down controls have specific price limit percentage bands to adhere to. The mathematics is set, but each system must manage these within the constraints of how it manages market data. Again this typically will be addressed by additional use cases in the test bed, but here the test harness will need to include simulated market data and may exercise the instrumentation as well as the software under test.
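The band check itself reduces to a few lines; the 5% band below is an assumed figure for illustration, since the actual percentage depends on the instrument's tier and reference price rules.

```python
def price_bands(reference_price: float, band_pct: float = 0.05):
    """Return the (lower, upper) price limit band around the reference price."""
    return reference_price * (1 - band_pct), reference_price * (1 + band_pct)

def within_bands(price: float, reference_price: float) -> bool:
    lower, upper = price_bands(reference_price)
    return lower <= price <= upper

# Test-bed cases driven by simulated market data (reference price 10.00):
assert within_bands(10.40, reference_price=10.00)        # inside the band
assert not within_bands(10.60, reference_price=10.00)    # limit-up breach
```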
In some cases the regulation is very technically specific. For example, in MiFID II there is a requirement for clock synchronisation in the general trading plant which is intended to live with the trade and to transcend the systems it traverses: institution, broker or market. To implement this requirement, clock synchronisation within fairly tight tolerances (100 microseconds in some cases) must not only be established in every system the trade touches, but that tolerance must also be maintained at the appropriate level. It is the latter requirement that is the most challenging, as anyone who has familiarity with managing clock synchronicity in trading systems will attest.
While many tools exist to facilitate clock synchronisation, in practice keeping the drift within tight tolerances across many platforms (which may have different mechanisms for clock synchronisation) is logistically challenging. It is likely that new monitoring specific to this function will need to be added across many banks’ trading plants.
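A drift monitor of the kind described could be as simple as the sketch below, which polls a public NTP server via the third-party ntplib package and alerts when the local offset exceeds the tolerance. This is purely illustrative: holding a 100 microsecond tolerance in production generally calls for PTP and hardware timestamping rather than NTP.

```python
import time
import ntplib  # third-party package, assumed installed (pip install ntplib)

TOLERANCE = 100e-6   # 100 microseconds, expressed in seconds

def monitor(server: str = "pool.ntp.org", poll_seconds: int = 60) -> None:
    client = ntplib.NTPClient()
    while True:
        response = client.request(server, version=3)
        offset = abs(response.offset)   # local clock vs reference, in seconds
        if offset > TOLERANCE:
            print(f"ALERT: clock offset {offset * 1e6:.1f} us exceeds tolerance")
        time.sleep(poll_seconds)

if __name__ == "__main__":
    monitor()
```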
Quality assurance of this kind is essentially a non-functional type of testing, but with a monitoring system of the kind mentioned it can perhaps be reduced to functional testing, albeit in a different domain. Possibly there are vended solutions to this problem as well, and as the regulation rolls forward perhaps more will appear, but of course at an added cost to managing the trading plant.
Standardisation
Where it is possible to implement, standardisation generally helps reduce the cost and increase the reproducibility of the QA program. Generic software test harnesses that get created within a bank often help the efficiency of the QA process, sometimes to a significant degree. However, their efficacy generally stops at the walls of the organisation, and since the whole purpose of electronic trading is messaging between counterparties, QA programs built around industry-accepted standards are also necessary.
The FIX protocol has been the standard-bearer in this realm and is the most successful of all cross-party protocol initiatives. The cost savings realised by the industry due to the wide adoption of the FIX protocol are so substantial that they would be difficult to calculate. The desire of industry participants to deliver a cutting edge offering is always there, however, and this drive for the competitive edge runs counter to the spirit of standardisation. The proliferation of binary protocols, and even customisations within the FIX protocol itself, have increased the complexity of the trading process and the technology behind it, even while improving things like trading latency and optionality within trading vehicles.
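The uniformity FIX buys is visible in how little code it takes to assemble a verifiable message. The sketch below builds a bare-bones tag=value message with the standard modulo-256 checksum (tag 10); the field values are illustrative only, and a real order would carry many more required session and order fields.

```python
SOH = "\x01"  # FIX field delimiter

def build_fix(fields, begin_string: str = "FIX.4.2") -> str:
    """Assemble a FIX message: header, body and modulo-256 checksum (tag 10)."""
    body = SOH.join(f"{tag}={value}" for tag, value in fields) + SOH
    partial = f"8={begin_string}{SOH}9={len(body)}{SOH}{body}"
    checksum = sum(partial.encode()) % 256
    return f"{partial}10={checksum:03d}{SOH}"

# A stripped-down new-order-single (35=D); values are purely illustrative.
msg = build_fix([("35", "D"), ("55", "VOD.L"), ("54", "1"), ("38", "1000")])
print(msg.replace(SOH, "|"))  # 8=FIX.4.2|9=...|35=D|55=VOD.L|54=1|38=1000|10=...|
```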
For those who wish to succeed in this environment, the QA process and tooling must keep pace. Generally broker dealers implement a mix of vended and proprietary solutions to manage their systems, and those who build or buy the strongest offerings in this space reap the rewards of that investment. The challenges of increasing regulation and requirements for system complexity are not likely to diminish anytime in the near future, but attention to designing and maintaining a good QA profile across the technology plant is a focus that will improve cost efficiency and competitiveness in the long term.

The FIX Protocol in a Blockchain World

By Ron Quaranta, Chairman of the Wall Street Blockchain Alliance, and Co-Chair of the FIX Trading Community’s Digital Currency/Blockchain Working Group.
For those of us in financial services who have been involved in the world of digital currencies and distributed ledgers (also known as a blockchain), 2015 was the year that this innovative technology finally appeared on the radar of every major bank, broker-dealer, exchange and institutional investor. On a global scale, entire teams within these organisations have been created to research and analyse how blockchain technology might integrate with existing financial markets workflows. Innovation labs have been set up to spearhead the development of this disruptive capability, and hardly a day has gone by that we did not read an article in business and trade publications touting distributed ledger technology and the coming disruption to financial markets. As we enter 2016, we will likely begin to see the implementation of many of these solutions on a wider scale. Thus, the purpose of this article is to briefly define what a blockchain is, discuss its history to date, and understand how the FIX Protocol may play an integral part in blockchain technology use in global financial markets.
What is the Blockchain?
The blockchain is essentially the underlying technology associated with the digital cryptocurrency known as Bitcoin. As many readers already know, Bitcoin was launched as open-source software in early 2009, by the still anonymous Satoshi Nakamoto, based upon Nakamoto’s published 2008 whitepaper “Bitcoin: A Peer-to-Peer Electronic Cash System”. The system is designed as a peer-to-peer online payments system, allowing participants to transact directly without needing an intermediary, with transactions verified by nodes on the Bitcoin network. Without diving too deeply into the more technical aspects of its operation, all transactions in the network are cryptographically recorded in a sequential chain of blocks (hence, “blockchain”), and each block references the one before it, thereby making illicit changes to the data extremely difficult. All nodes on the network hold a full copy of this public distributed ledger. Within this ecosystem, the blockchain’s unit of account is bitcoin itself. So with no central repository and no single administrator, all nodes on the network are incentivised to verify transactions, with their reward being the “mining” or creation of new bitcoin.
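That chaining property is easy to demonstrate in a few lines. The toy ledger below (plain SHA-256 hashing, no mining or consensus) only illustrates why editing one historical record invalidates the chain after it.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# Build a tiny chain: each block commits to its predecessor's hash.
chain, prev = [], "0" * 64
for tx in ("A pays B 1", "B pays C 2", "C pays A 1"):
    block = {"prev": prev, "tx": tx}
    chain.append(block)
    prev = block_hash(block)

def verify(chain) -> bool:
    return all(chain[i + 1]["prev"] == block_hash(chain[i])
               for i in range(len(chain) - 1))

print(verify(chain))              # True
chain[0]["tx"] = "A pays B 100"   # tamper with history
print(verify(chain))              # False: the link after the edited block breaks
```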
Bitcoin is the oldest and most widespread of the digital currencies, with a global market capitalisation of over USD6 billion as of late December 2015. However, it is not the only game in town. As of the time of this writing, there were over 650 publicly known digital currencies, many with little value, but most leveraging some form of blockchain technology.
So why is this important? Why is the concept of a blockchain relevant to financial services? To answer these questions, it is helpful to put the main characteristics of blockchain technology in focus:

  • Blockchains are meant to be data stores for sequential or chronologically ordered data.
  • Blockchains are designed to be immutable, or tamper resistant. The ability to rewrite or alter data would be so costly across a wide enough network as to be prohibitively expensive.
  • Cryptography and its associated digital signatures prove the identity of, and enforce access by, participants within the network. This also prevents the problem of “double-spending”.
  • Public blockchains are designed to be transparent and anonymous.

The Public versus Private debate
At this point, it is important to touch on a topic of significant industry debate, namely the growing discussion about public versus private blockchains. The Bitcoin Blockchain is an example of a public, fully decentralised, permission-less data store that is open to anyone wishing to participate in the network, and willing to adhere to the consensus mechanisms that enable its functionality. In addition to Bitcoin, Ethereum and Factom are well known examples of public blockchains. A private blockchain is often designed to utilise the most interesting aspects of blockchain technology, while maintaining the closed network aspects of many data systems in financial markets. As opposed to the decentralised model, the owner or administrator of a private blockchain maintains the authority to implement changes and define membership in that blockchain.
As we give thought to the many functions within the world of financial services, we start to envision how this capability might make workflows and data processing more efficient and cost effective. For example, how economically beneficial would it be for multiple banks to be leveraging a common, shared data store for a relevant business line? In a world where global financial organisations have literally hundreds of different databases, and every one of these firms has some form of “reconciliation” department, one can imagine the cost savings and growing transactional volume in a world of more efficient data processing. Currently, several firms are working on blockchain based solutions that would re-invent specific segments of financial markets processing, including syndicated loans, securities clearing and settlement and international trade finance.
None of the above as yet addresses the concerns that are being encountered within the industry about blockchain technology. How do competing firms leverage this technology while still protecting their own and their clients’ data? How will regulators view industry participation? Can transparency make the burden of compliance easier? Are private blockchains simply different databases, and how can the full benefits of a distributed ledger be realised if participation is centralised? These and many other questions are already under serious consideration in the industry, and the landscape will invariably continue to develop over the coming year.
What is the role of FIX in all of this?
Up to this point, we have not really addressed the role of the FIX Protocol in the world of blockchain technology. The reason is simple. Given the dynamic state of this technology and its evolution over the past few years, the financial markets and members of FIX Trading Community are working to define not only integration points, but also potential use cases and best practices for a future blockchain world. In 2015, FIX formed the Digital Currency/Blockchain Working Group, with the mission to “…identify, analyse and define use cases and integration points for digital currency and distributed ledger technologies across the spectrum of capital markets requirements, and recommend best practices for FIX implementation and usage of this emerging technology in financial markets”.
The group has a broad membership, with participants from banks, broker dealers, blockchain start-ups and more, all with a wealth of industry experience. Within this group, members have embarked on a series of activities, including:

  • Educational seminars for Working Group members, to deepen our group understanding of blockchain technology and its implications.
  • Preliminary review and recommendations of FIX implementation for digital currency trading (with much of the FIX Protocol already quite useful for digital currency trading in the context of FX).
  • Initiated review of FIX and wider industry standards of digital currency and digital asset identifiers, as well as global marketplace identifiers.
  • Structured an initial review of securities clearing and settlement via blockchain, with a goal towards understanding impact to FIX related messages.
  • Created framework for FIX integration in broader distributed ledger initiatives, including smart contracts and alternative digital assets.
  • Initiated discussion about alternative FIX uses, e.g. embedding FIX messages within a data block.

Conclusion
This article is meant to be a very basic introduction to the world of blockchain and FIX, and merely scratches the surface of this growing field and its implications for financial markets and beyond. In future articles, we will address specific use cases as well as dive into the capability at a technical and product level. FIX Members are also encouraged to join the dialogue by participating in the Community’s Digital Currency/Blockchain Working Group.

The State Of Buy-Side Technology

With Carlos Oliveira, Electronic Trading Solutions at Brandes Investment Partners, L.P.
There are always numerous pressures on buy-side technologists to update and improve platforms. There are demands for cost containment, there are ongoing regulatory changes to keep technology compliant, and then there are changes that are needed as a result of evolving market structure and requested functionality changes from the trading teams. Each of these is evolving in its own way but all impact the functionality and control of systems and technology.
More pressure to control costs
With high levels of volatility ringing in the New Year in global markets, many buy-side firms continued to see a decrease in the value of their assets under management. If this trend continues, technology budgets, which have been tightly controlled over the last few years, should expect an even greater squeeze. In just about every firm, there is a greater need to show a very strong ROI justification before taking on new projects. Projects to meet a regulatory requirement, however, will continue to have priority and easier funding.
At the time of vendor contract renewal, buy-side firms are being very surgical about what they renew, and more time is being spent renegotiating contracts and licences. Everyone is being very cautious with expenditures.
Legacy and new systems
For years, buy-side firms had a preference for undertaking in-house development. Whether to protect intellectual property or because of a real or perceived lack of suitable options, “building your own” was for many the only way to go. That model has been eroding, and combined with advances in technology offerings, regulatory demands and cost containment pressure, for certain systems more and more firms are opting to buy and/or have a vendor-hosted solution, rather than build (and/or maintain) something in-house.
As long as business reasons dictate the need to keep some of the existing infrastructure, we should continue to see a hybrid model where buy-side firms maintain these legacy systems – though providing little or no enhancements – while at the same time giving existing vendors and new players an opportunity to prove themselves. Inevitably, IT departments on the buy-side will continue to get smaller, but will nevertheless continue to be an integral, essential component of the business.
Growing need for data
Small and mid-size buy-side firms rely heavily on the sell-side brokers for street connectivity, algorithmic strategies, and analytics. Extremely competitive, the sell-side firms seek to differentiate themselves by providing customisation to meet portfolio manager and trader needs.
There is a growing trend for buy-side firms to ask their brokers for more data associated with their trade executions. Trade execution data is being used to further understand and quantitatively validate that the algorithm strategy is indeed performing as advertised or requested. With a continuously evolving market structure, the buy-side is increasingly expected to know and understand what happens when an order leaves their system.
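One hedged sketch of that kind of validation: computing a parent order's slippage against its arrival price from the broker's fill data. The record layout and figures are invented for illustration.

```python
def slippage_bps(fills, arrival_price: float, side: str = "buy") -> float:
    """Average-fill-price slippage versus the arrival price, in basis points."""
    qty = sum(q for q, _ in fills)
    avg_px = sum(q * px for q, px in fills) / qty
    signed = avg_px - arrival_price if side == "buy" else arrival_price - avg_px
    return 10_000 * signed / arrival_price

# Fills reported back by the broker for one algo parent order: (qty, price).
fills = [(300, 25.02), (500, 25.05), (200, 25.01)]
print(f"{slippage_bps(fills, arrival_price=25.00):.1f} bps vs arrival")  # 13.2
```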
As regulation across the globe increases, we can expect some of the data may be combined with other data sources and used internally to help the buy-side meet their own compliance oversight and reporting obligations, and externally to demonstrate that appropriate choices and controls are in place.
That then feeds into the area of standardisation. It can quickly and easily become quite burdensome for the sell-side to provide multiple custom formats and fields. As these requests become more common, there is an obvious need to work out whether there is a way for the sell-side to present data and information in a standardised way. Unfortunately, there is still a great divergence as to what people want to know, and so a corresponding lack of standardisation will continue for a while.
Sensitivity to regulation
We always have to prepare and get ready quickly to be compliant with the most stringent of regulatory demands because that is the nature of having a global footprint. Ideally, we aim to produce something that’s more encompassing than just the minimum required, and this helps us stay ahead of the regulatory curve.
One example is last year’s SEC proposal for a set of broad rules requiring mutual funds and Exchange Traded Funds to develop and implement a formalised liquidity risk management program, in order to reduce risks associated with redemptions under certain market conditions. While the final rules will likely change, we began assessing their impact shortly after the announcement and are already looking at possible vendor solutions.
MiFID II and the significant changes it will introduce have been on the radar for many firms. While some team members have been keeping a close eye on the discussions regarding commission unbundling, others are assessing the expanded scope of investment firms’ transaction reporting obligations. Transaction reporting will apply to a wider range of financial instruments, require additional data and impose obligations on firms that receive and transmit orders. The additional year to comply, recently granted, is a welcome relief for many.
Fortunately, when it comes to new regulation, the affected parties are given the opportunity to voice their concerns. For a firm of our size, more often than not, we can find vendor solutions to help with the technology, timing and resources needed to be compliant. It can nevertheless be quite daunting to get the data and the processes ready to be compliant, and then to continue with their ongoing maintenance. For a global firm in this industry, keeping up with or staying ahead of the regulatory requirements is a significant but important cost of doing business.

Three Desks, One Region

With Joe Rodela, Head of US Trading, Allianz Global Investors
US Trading at Allianz Global Investors operates within a global trading platform – but with a regional approach. We now work out of three locations in the US. Our trading used to be solely San Francisco based but when we merged a number of boutique firms owned by Allianz, we integrated them into our global trading platform. It is still one US trading platform but it now operates out of San Diego, San Francisco and New York City.
Most equity-related trading takes place on the San Diego and San Francisco desks. Most derivatives, futures and some emerging market debt trading takes place out of New York, where we have two very highly experienced options/derivatives traders. We used to trade some derivatives out of San Francisco but two years ago transitioned that flow to New York.
The reason for having trading activities in the different locations is that we have portfolio management activities there, and our experience is that there is added value in having traders operate in the same location as portfolio managers.
But I must emphasise that even though we are located in three different places, we really do function as one desk. We all use the same trade blotter so whenever a portfolio manager enters a trade, we can all see it. Similarly, we use the same turret system to our brokers, so if a broker calls the desk, the call comes into the three different locations simultaneously; any one of us can pick it up and service that call for any other trader.
We follow all the same policies and procedures and our commission management and broker voting systems are identical. Our view is that we are ‘one desk, one team’. It just happens that we are sitting in three different locations.
In order for it to work, I rotate through our San Diego office once a month, and I go to the New York office every quarter. At least once a quarter our other traders will play musical chairs as well and work from the other locations. In that way we enhance the ‘one desk, one team’ culture.
There are often times when I will be sitting on the New York desk but I’ll still be trading for the portfolio management teams in San Francisco, or I could be in San Diego doing the same thing. The way our telephone systems and trade blotters are set up means that it is unlikely that the portfolio manager will actually know which office I’m in. They can reach me just as they would at any other time; regardless of where I am sitting, they will know how to contact me. They will only know where I actually am if they physically try to come and find me. Technology has helped make this transition relatively seamless. In addition, there is an intercom system set up between the desks. We just have to hit a button and we connect to the traders in San Diego or New York instantly, and they can do the same thing back to us.
We have carefully examined the risks and rewards of this type of set up. We need to keep united as one team given that we are physically in different locations. The technology available to us and the fact that the traders are willing to rotate around the offices means it works well. Another advantage of having three desks is for business continuity and disaster recovery purposes. For example, there could be an emergency in San Francisco which might prevent us from getting into our offices. All we would need to do is put a call through to San Diego or New York in order to have them pick up trading activities for the day.
Global positioning
The way our global trading platform is set up gives us centres of expertise. It works in a similar way for the portfolio manager team regardless from where they operate. If a US-based portfolio manager wants to enter a trade in Taiwan, they put the order in the system as normal. The only thing that is different is the desk to which they route the order. For example, for Taiwan they would just highlight our Hong Kong desk which takes care of pan-Asian trading. They hit the enter button and the order will appear on our Hong Kong desk; so when the Hong Kong traders come in the next morning, they will see the order on their blotter just as we would see it here and they work it just as they do for the local Asian-based portfolio management teams in Asia.
For portfolio managers, analysts, and traders, the process is pretty seamless regardless of where they are sitting.
Strategy
If a firm is to set up a global trading platform to trade anywhere in the world, there are three ways of approaching it:

  • The firm could operate out of one location which would mean the desk would require 24-hour cover, so there would be shifts of traders coming in around the clock.
  • A firm could have one location but base the hours on whichever region the desk is in. So in the US, they would have US hours and when the traders go home, they would simply hand off orders to the brokers who would hand them off to the regions. The traders would need to be on call 24 hours a day in case anything happened with the order that needed attention, regardless of the time of day or night.
  • The third option is the way we operate: having specialised trading desks in the regions so as to have experts in each market. It allows for the promotion of best practice within each specific region. We adopt the same policies and procedures on a global level, but there are also regional sub-sets of policy for each specific region.


Get in line: Lynn Strongin Dodds

As the EU referendum campaign accelerates, the mud-slinging is keeping pace. This was never more evident than in Boris Johnson’s claims in the Sun newspaper that President Barack Obama’s “part-Kenyan” heritage and “ancestral dislike of the British empire” were the reasons behind his arguments for the UK to stay in the European Union and his decision to remove the bust of Winston Churchill from the Oval Office.

Johnson has his defenders, such as Iain Duncan Smith, who on the BBC noted that Johnson’s comments “simply referred to some of the reasons why Obama may have a particular lack of regard for the UK.” Britain’s foreign secretary, Philip Hammond, however, put him in his place by noting in the Telegraph that “People who aspire to hold offices of great responsibility do have to show that even under pressure they retain their cool and they don’t step over any red lines.”

There is no doubt in my mind that Johnson more than overstepped the line. Politics is a dirty business and it is understandable that a US president wading into this controversial debate would raise the hackles of the leave campaign. However, such narrow-mindedness is surprising in a man who is known for his razor-sharp intellect. Undertones or overtones of racism should not be tolerated, and pointing to Obama’s roots is not a strong counter-argument, especially when it has long been US policy to support British membership of the EU.

The President himself took the higher moral ground and did not respond except to say that he thought even Britons would understand why he might find it “appropriate” to replace Churchill with the Rev. Dr. Martin Luther King Jr. instead.

Johnson would have been wiser to look towards Obama’s upbringing. Spending time in Indonesia as a child, as well as in Hawaii with his maternal grandparents, probably had a greater influence on his worldview. It helps explain why he has tried to forge closer ties with the region. This has not only culminated in the “Pivot to Asia” strategy – a cornerstone of the administration’s foreign policy – but also the efforts that have gone into making the Trans-Pacific Partnership trade pact a success.

Some might like to believe that the focus will turn back to the UK and Europe once Obama leaves office. If that is the case, they should listen carefully to Hillary Clinton and even some of the Republican candidates. Asia will remain an important plank in the strategy. This does not mean though that the special relationship between the UK and US is not valued. Old friends are important but if the UK decides to leave the EU, then it would not be surprising to see the US forge newer ties with its European peers.

Americans may hold onto the past but moving forward is ingrained in the culture. Anyone who doesn’t understand that should move to the back of the queue.

Lynn_DSC_1706_WEB

Lynn Strongin Dodds

©BestExecution 2016
