
The Value Of The Human

Daniel Bisaillon, Head Trader, Equity Markets, La Caisse, examines the ongoing role of the trader.
There are limits to human intervention as even the best traders can’t watch everything at once. When crashes happen – if everyone is on the same side of the trade and algos react in the same way – human intervention can be very important to slow it down.
On single trades one of the most important skills is finding market colour and understanding what’s going on in the name and sector. The human has the intelligence, the instinct and the knowledge to piece the complex data together. Then the decision has to be made about how to execute: whether to use a broker, to use an algo, or whatever. You can use a machine to do the trade, but a human is needed to decide how to trade in the first place.
It varies by market but certainly in Canada you need to make the phone calls, use the right broker, find the right clients, and it can be difficult. The human is a must in that process in a market of this size.
Finding the right price, finding the liquidity, and working with the PM on their views of the market require more inputs and complexity than a program can handle. Computers are becoming more capable, but there are limits. They often react to the same inputs in the same way, which leads to events like the Flash Crash. People can calm it down and take that step back.
There are a lot more actors in the marketplace now, and you need technology to interact with the other technological elements of the markets, but then we need the humans to interact with the other human elements, which is what allows us to get the best trades, especially in illiquid trades. If the inputs for the machines are hidden by a lack of liquidity, the machine has nothing to go on. People have to do that digging and build the colour. I don’t want to compete with the computers buying a single lot of shares, I want to do blocks and bigger trades, which is more difficult to get purely electronically. And if I were to rely on purely electronic means, it is much easier to give away my intentions and the size of the positions I am trying to build. I can work it more subtly as a human and get the best execution. There is an ever growing role for technology, but the fundamental importance of the human remains.

Big Data’s Role In Compliance And Risk Management, Roundtable Coverage

By Peter Waters, GlobalTrading
On 11th March a wide range of industry participants gathered in London to discuss the latest matters of data management and visualisation, and how compliance and risk functions need to embrace the latest technologies to stay ahead and provide positive benefit to the firm as a whole.
An initial area for discussion was the challenge of finding the balance between having expertise in house and being able to make the decision on what solutions might or might not be fit for purpose.
One way to work towards this is through collaboration across a firm. Everyone has to address this issue. The ongoing regulatory push is to bring wider transparency across asset classes and markets – OTC derivatives; swaps etc. Rules are the same, as are the problems and challenges, but it can be hard for people to change workflows. Collaboration across market participants and vendors is also increasingly needed. Investment banks need to agree on priorities, namely where they are drawing data from, and how to manage risk.
Next the panel moved on to discuss wider trends in risk and compliance. To date, many compliance investment decisions have been made on the back foot. It was suggested, though not all agreed, that firms need to get rid of excessive staffing in compliance and invest in infrastructure and systems.
Discussion centred around the fact that the quality and quantity of data is much better than it has ever been. That said, firms need to better identify and clarify what particular aspects of the data they need to figure out. Data needs to be cleverer to offer greater value, but it is a lack of discipline that means individuals can’t articulate what they want out of it. There is seemingly a lack of skills in financial services, resulting in a trend towards managed analytical services. External managed services may be a solution.
The room soon moved on to discuss what compliance officers can do themselves, and how other members of the broader firm can assist. It was determined that firms should also be looking at infrastructure and capturing information properly and efficiently. They have to adjust systems and processes much more rapidly, embracing a model of more fluid adaptation.
Because of limited resources (both human and technological) and because there is so much data from so many sources, firms are trying to figure out how to best exploit it. Is there a roadmap that can be agreed and followed?
One difficulty highlighted is that analytics means different things to different people and technology isn’t necessarily the problem. People need to clearly define what they want to get from the data, and then share across the firm. The problems emerging in Europe when conducting cross-border trades have been evident in Asia for some time.
One positive developing trend is that a lot of banks are standardising underlying systems. Alignment is happening, meaning that data is coming from one centralised source within a firm. But this development is not going to happen overnight. The changing regulatory burden, and how it continues to shift from the regulator to market participants is equally evident.
There is however a disconnect between the financial services industry and other industries in terms of how the user experience develops. There is a need to get dashboard designers involved. It’s very easy to give a terrible interface because there is a lack of incentive to invest, and firms are choosing not to spend unless it is deemed an absolute necessity. Industries such as healthcare and some engineering sciences offer users of data the opportunity not only to access large amounts of data, but also to interpret it more intuitively. In financial services, for example, highlighting a position that breaches an accepted standard deviation may be more valuable in terms of identifying any immediate risk to the firm than concentrating on the overall trading picture.
It was generally agreed that the scope of products – alerts, flags and other user interface points – has to get smarter. The ongoing battle is one of accuracy vs. precision: when data quality drops there isn’t the capacity to adapt to that drop.
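To make the standard-deviation example above concrete, here is a minimal sketch of that kind of exception-based flag, assuming a hypothetical positions structure (position_id, current exposure, exposure history) rather than any particular vendor system.

```python
# Minimal sketch of an exception-style alert: flag positions whose current
# exposure sits more than n_devs standard deviations away from their own history.
# The field names (position_id, exposure, history) are hypothetical.
from statistics import mean, stdev

def flag_outliers(positions, n_devs=3.0):
    """Return ids of positions whose latest exposure breaches n_devs of their history."""
    flagged = []
    for p in positions:
        hist = p["history"]            # past exposures for this position
        if len(hist) < 2:
            continue                   # not enough data to form a baseline
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(p["exposure"] - mu) > n_devs * sigma:
            flagged.append(p["position_id"])
    return flagged

positions = [
    {"position_id": "EQ-001", "exposure": 5.2e6, "history": [1.0e6, 1.1e6, 0.9e6, 1.2e6]},
    {"position_id": "EQ-002", "exposure": 1.0e6, "history": [0.9e6, 1.1e6, 1.0e6, 1.05e6]},
]
print(flag_outliers(positions))  # -> ['EQ-001']
```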
The debate moved on to the role of cloud computing, how cloud services could help collaboration between regulators and market participants, and how data feeds could be shifted to a public cloud. This is a viable gateway to have data routed to your own private cloud. But a key concern hinges around privacy for internal data. Regulation tends to be nationalistic in approach, which is particularly challenging in both Asian and European markets.
The elephant in the room, as decided by the panellists, is that the regulator has outpaced those involved in compliance and technological advancement. End users are not making the most of the products that they have access to. People don’t have the skill-set necessary to build their own technology.
A balance in a firm’s team is needed so that people understand what they are talking about and individual decision makers are properly informed. There is currently a disconnect between IT and everyone else. IT builds what they think a firm needs, but often it hasn’t been relayed what the technology is needed for. Technologists often fail to deliver because they are not practitioners. A balance is obviously required.
The broad conclusion was that people’s skills need to be updated so that the collaboration between developers and practitioners is much easier. The skills should flow in both directions, and that would help reduce unnecessary expenditure. More effective data visualisation and processing would support this, by helping compliance and risk officers exploit the data that they have to the full.

Dealing With Unbundling

Jason McAleer, Head of Dealing, Jupiter Asset Management, examines ongoing change in commission management and its impact across the buy-side and sell-side.
Asset managers are now being a lot more explicit about how they spend and what they get for their money. This is being done through clear voting, identifying clear research needs, and generally, with asset managers appreciating that the money they spend on research is effectively their own. With greater care and due diligence about the research spend, the buy-side can make a big difference to how unbundled they are.
Drivers towards greater unbundling seem to come increasingly from the regulators rather than from the clients. We, of course, act in the best interest of our clients, but with the push from the regulators, we are having to engage much more in the process of thinking about what we receive for our cash. In a pure unbundled world we would be allocating individual brokers specific amounts for defined permitted services: bespoke analyst meetings, research, execution. Once we’ve established what we are paying for, we can start to look at budgets and managing that expenditure. We are better able to break out what we are paying, which makes us more accountable to our clients, and more transparent to the regulator.
Sell-side impact
It is a very different matter when it comes to the sell-side. For some firms this drive towards full unbundling is undoubtedly a good thing. But for others it could cause their models to struggle, and that is definitely something we should be aware of when making changes to how the market functions. The smaller independent research firms are highly valued and managers do seek them out. Our ongoing and well developed use of CSAs is very helpful in breaking out payments to the smaller firms to get specialist research. But as we have seen, there could be changes to how CSAs function (or indeed exist), which would have consequences here.
It is not only the direction of change towards unbundling that is important, but how that change manifests and what tools we can use to allocate our spend. There are vast differences between buy-side firms. Depending on the budget and the style of the house, you can break out your spending by fund, by individual manager, etc., and there needs to be clarity as to how we pay. There also needs to be a better two-way conversation, with brokers offering their services through more of a menu – this allows us to access their services in a way that both parties are happier with. We need to know the true cost of research and the true value that it provides us, just as much as the sell-side needs to start giving value to these metrics. Mid to small sized brokers whose model was to capture flow to pay for research could suffer as these changes manifest, and we could see mergers off the back of this.
One consequence is that more and more analysts are leaving the sell-side research houses, either to go independent or to move into the buy-side. This is an ongoing trend, and one that needs more time to develop, but buy-side firms are starting to look at the potential benefits of building out bigger in-house research teams.
Buy-side management
At Jupiter we are currently building out our commissions management policy – taking key aspects of how we pay for research and pay for execution, and putting that into a formal policy. This allows us to better formulate research budgets across strategic groups – rather than have a budget that can run away, this allows us to be much more focused on consumption and to properly value research that’s voted for. We are getting there slowly: as mentioned, it is a difficult conversation to get the brokers themselves to value research separately and to put a firm price on their offerings, and we need to work closely with our own fund managers. We are starting to work with a commission aggregation firm that allows us to organise and manage much of this operationally.
There is an angle to this which both benefits internal transparency and external transparency. By being more specific in information from our fund managers when they are voting we are much better able to put a specific dollar value on a given piece of research. That level of detail is very helpful for us. It is still very much more art than science – until brokers come up with tariffs – the same research will be worth different amounts to different people.
One slightly worrying consequence of this drive towards more specific pricing of research would be if the investment banks set that price targeted at the largest asset managers. We could see many small and mid tier buy-sides simply being priced out of being able to afford the research they need. So with smaller sell-sides suffering if they can’t use natural flow, and with buy-sides being priced out of the market, there is a very real threat to the nature of the industry in the UK if this is not properly managed.
Global
There is an ongoing discussion as to how this feeds into a global conversation. The UK and Europe have been developing unbundling for a long time – pre MiFID I and into MiFID II. The US predominantly pays for research via soft dollar arrangements. In Asia global firms are starting to become more unbundled, but in less developed markets it is still very difficult. On the sell-side, they can often see our money coming to them simply as revenue from us as a whole client; the specific split between research and commission doesn’t always happen, resulting in a bundled view of the revenue. And I think that needs to change, as again that would give them a view as to their revenue flows, which allows them to cost back to us a lot more clearly. There is a debate as to whether Europe and the UK going it alone on this opens us up to an arbitrage – why would an asset manager work in these environments that are more regulated than the US or elsewhere?
We do need much greater transparency from the regulators as to what they are likely to implement and need to make sure that the CSA systems we are building out will be useful beyond January 2017. At the moment we don’t have that clarity and it is causing us difficulties.
To summarise, we need to become a lot more precise in how we are putting a value on research and commissions, and the sell-side need to become a lot better at putting a value on their products. Once we have our budgets and they have a price list, we can see how to work our needs and budget to that. It will be tough, but with CSAs and systems in place we can make the industry more transparent, and hopefully not disenfranchise too many of the smaller firms along the way.

Boarding The Train To China

Chris Seabolt, Asia Head of Trading at US-based Fidelity Management & Research looks at the development of their use of the Shanghai-Hong Kong Stock Connect, and its role in gaining China exposure.
What enabled you to use the Connect more than other firms?
When the initial announcement of the Stock Connect was made, US-based Fidelity Management & Research Company (FMRCo) realised this presented a historic opportunity to invest directly in China that we needed to prepare for. We set up an internal task force across a number of functional areas such as trading, operations, compliance and legal to address the various aspects of the Connect.
This was no small task – these were new market regulations covering a new trading mechanism; we had to develop an approach that fit our needs and risk tolerance; agreements had to be reached and accounts opened with brokers and custodians; trading, compliance and custody systems were set up and tested; education and updates were provided to senior management, our investment teams and oversight groups. Ultimately, this intense focus and effort across these various functional groups allowed us to address many of the issues that many other large investors could not.
How has this been advantageous?
FMRCo was able to start building A share positions through the Connect starting November 17th, 2014 and our funds represented a substantial portion of the Northbound trading quota on the first days of the program.
Taking advantage of this early window of trading allowed us to establish substantial positions across a number of A share companies on behalf of our fund shareholders, and since then we have seen a substantial rally in the Chinese equity market. While we always take a long-term view, our timing was fortuitous and this has been a big benefit to our funds and shareholders.
Best/worst aspects – what would you change if you could?
The pre-delivery requirement and the nuances around the tight settlement cycle are the two areas that have been the most challenging for us. In order to avoid pre-delivering stock on a sale trade, we have a process with our local custodians and their brokerage arms.
Still, we are exploring alternatives to have a full suite of execution counterparties for our A share sales. In terms of the settlement cycle, the non-standard DVP process is something that’s unique to China and a departure from our standard workflow. These are the two areas we would like to see enhanced. In the meantime, we are satisfied with the process we have developed to operate effectively in the Connect and we look forward to its ongoing evolution.
Enhanced model – are you going to engage, why/why not?
Similar to the first iteration of the Connect, we are dedicating time to understanding the new model. While it could represent some improvements, such as the elimination of the pre-delivery requirement, there are still some issues we are facing relating to the tight settlement cycle. We will continue to use our current Connect process while we monitor ongoing developments.
Future of Connect – do you want more exchanges, more flow, more institutional firms involved?
We look forward to the anticipated addition of Shenzhen-listed A shares and additional Shanghai-listed companies, which should increase the investable universe of Chinese companies that we can access through the Connect.
The potential addition of A shares to key market indices, which many global asset managers use as tracking benchmarks, is also a development we will watch closely in the coming months. FMRCo views the Stock Connect as a first step in the broader opening of Chinese capital markets which presents an exciting opportunity for investors globally.

Top Takeaways from the 2015 FIX Americas Conference

By Candyce Edelen, CEO, PropelGrowth

Last week, on April 15, 2015, the FIX Trading Community put on an excellent half-day event for the Americas Conference. It was a substantially smaller event than in past years, in an effort to make the content more focused and the event profitable. The team accomplished both goals.
Here are some of the highlights of the event.
How Regulation is Like Debugging Code
My favorite session was the keynote by Gregg E. Berman, who until recently served as Associate Director of the Office of Analytics and Research in the SEC’s Division of Trading and Markets (he’s stepping down this month). Throughout his tenure at the SEC, he has focused on using data-driven analytics to inform policy.
He compared developing regulations and altering market structure to the process of developing and debugging code. This was an interesting and very relevant metaphor. As Berman pointed out, there’s a difference between debugging and tossing out code. You have to know what the program is actually doing versus what it’s supposed to do before you can debug issues. You can’t just delete a line of code and expect things to work properly. Following this approach, his department has been very focused on understanding the implications of changing regulations.
For example, the JOBS Act included an item about decimalization. The legislators posited that if they widened ticks for small caps, there would be more profit for brokers to make markets, which would lead to more liquidity for small cap firms, which would create more jobs. This seems like a reasonable assumption.
But Berman’s team dug into the data and found that small and mid cap stocks are not homogeneous. There are wide variations in the levels of liquidity, the average daily spread and volume at depth. In 2013, approximately 41% of small caps were trading at an average daily spread of 4.5¢, 14% had a spread smaller than 1.5¢, and another 14% traded at a 15.0¢ spread (I was scribbling madly, so I might have recorded these numbers incorrectly, but you get the point). So you can’t just raise tick size across the board and expect a positive result. A tick pilot needs to be very specific.
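As a rough illustration of that kind of bucketing analysis (with invented figures, not the SEC’s data), a sketch along these lines groups names by their average daily quoted spread:

```python
# Illustrative sketch of bucketing small caps by average daily quoted spread,
# in the spirit of the analysis Berman described. The sample spreads are
# invented; a real study would use consolidated quote data per symbol.
def spread_distribution(avg_spreads_cents):
    """Share of names whose average daily spread falls in each bucket (cents)."""
    buckets = [("under 1.5c", 0.0, 1.5), ("1.5c to 4.5c", 1.5, 4.5),
               ("4.5c to 15c", 4.5, 15.0), ("15c and above", 15.0, float("inf"))]
    total = len(avg_spreads_cents)
    return {label: sum(1 for s in avg_spreads_cents if lo <= s < hi) / total
            for label, lo, hi in buckets}

sample = [0.8, 1.2, 2.2, 3.0, 4.5, 4.7, 5.1, 9.9, 15.0, 22.5]  # invented spreads
print(spread_distribution(sample))
```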
Berman also addressed Maker/Taker changes. He talked about a Nasdaq pilot that reduced maker/taker fees on 14 stocks. They found that the change reduced Nasdaq’s market share overall. But the reductions were very different for different stocks. For example, Bank of America (BAC) saw a very slight increase, while several stocks experienced as much as a 12% reduction in market share. Liquidity moved less at depth than at top of book. Berman urged the markets to do more testing on this topic.
I hope that Berman’s influence over the SEC will outlast his tenure. Several people expressed hopes that his work will help prevent unintended consequences like those caused by RegNMS.
Order Execution Transparency
The buy-side working group is still working on getting more transparency from the sell-side about the routing decisions that are made as their orders are executed. I wrote about this topic after last year’s regional briefing.
The primary goal of the initiative is understanding how the buy-sides’ orders are being represented to the markets. They’re seeking information about execution venue selection and routing decisions, but they want to know more than just where the order was executed. They also want to identify venues where they don’t get executed. I think part of this has to do with their desire to understand the motivations of the sell side in their routing decisions. For example, how much is routing influenced by maker/taker pricing incentives? Are the routing decisions that are motivated by rebates resulting in fills, or are they getting routed away? But the panelists insisted that it’s not just about transparency in routing decisions; it’s also about consistency – they’ve been getting very different results across firms, and they want to be better at predicting outcomes.
Issues making this more complex include timestamp inconsistency, the lack of maker/taker flags from the exchanges, and inconsistency in what’s being asked from different buy-sides. The committee is trying to make the requirements as consistent as possible to make it easier for the sell-side to comply. They appear to be making progress. One panelist/buy-side working group committee member said he’s now getting venue information from close to 90% of his counterparties in the US and 70-80% globally.
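As an illustration of what that venue reporting enables, the sketch below rolls up fills by execution venue from FIX-style execution reports, using the standard LastMkt (30), LastQty (32) and LastLiquidityInd (851) tags; the sample messages and pipe delimiter are invented for readability.

```python
# Sketch of the venue roll-up a buy-side desk might build once brokers return
# venue detail on fills. Tags used: 30=LastMkt, 32=LastQty,
# 851=LastLiquidityInd (1=added liquidity, 2=removed liquidity).
from collections import defaultdict

def venue_breakdown(exec_reports, delim="|"):
    totals = defaultdict(lambda: {"qty": 0, "added": 0, "removed": 0})
    for raw in exec_reports:
        fields = dict(f.split("=", 1) for f in raw.strip(delim).split(delim) if "=" in f)
        venue = fields.get("30", "UNKNOWN")
        qty = int(fields.get("32", 0))
        totals[venue]["qty"] += qty
        if fields.get("851") == "1":
            totals[venue]["added"] += qty
        elif fields.get("851") == "2":
            totals[venue]["removed"] += qty
    return dict(totals)

fills = [
    "35=8|30=XNAS|32=300|851=2|",
    "35=8|30=BATS|32=500|851=1|",
    "35=8|30=XNAS|32=200|851=1|",
]
print(venue_breakdown(fills))
```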
Automating the IPO Process
I was very surprised to hear how little automation is in place for running IPOs. Buy-sides generate orders based on their desired participation levels, and the trading desks PHONE these orders into the syndicate of brokers running the IPO (yes…they still phone them in). After the deal is priced, the buy-side gets a portion of the allocation they asked for at a particular price. The broker CALLS the buy-side trader with the allocation, which the trader manually enters into his system. The entire process is non-transparent and has many opportunities for mis-information or human error as the traders talk to multiple brokers in the syndicate. These are typically the largest orders on a trader’s book, there are frequent modifications, the orders live over the course of several days, and there is substantial risk of error at every step.
The buy-side committee has diagrammed how the IPO process works and will work on developing a model for replicating the process electronically, which should substantially cut down on errors. But it’s likely that the industry will take a long time to adopt this approach, even after the committee finishes the initial design and specifications. (After all, how many firms are still using FIX 4.2?)
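For illustration only, a minimal data model for that electronic IPO workflow might look like the sketch below; the field names are hypothetical and not the committee’s design.

```python
# A minimal, hypothetical data model for the IPO workflow described above:
# an indication of interest from the buy-side and the allocation that comes
# back from the syndicate once the deal is priced. Field names are invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class IPOIndication:
    deal_id: str
    account: str
    shares_requested: int
    limit_price: Optional[float] = None   # None means "at the offer price"

@dataclass
class IPOAllocation:
    deal_id: str
    account: str
    shares_allocated: int
    offer_price: float

indication = IPOIndication(deal_id="NEWCO-2015", account="FUND-A", shares_requested=250_000)
allocation = IPOAllocation(deal_id="NEWCO-2015", account="FUND-A",
                           shares_allocated=90_000, offer_price=21.50)
print(allocation.shares_allocated / indication.shares_requested)   # fill ratio: 0.36
```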
Automating the Post-Trade Process
Allocations were automated in 1997, but FIX automation stopped there instead of completing the full lifecycle of a trade. Post trade processes run much the same today as they did 20 years ago. There is still a lack of straight through processing through confirmations and clearing.
The buy-side working group has been encouraging brokers to start using FIX for allocations and confirmations. According to an article Scott Atwell wrote for FIX Global, the benefits of this automation include efficiency gains, improved straight-through processing, and quicker identification of issues, all of which provide significant risk reduction and cost savings.
One of the panelists talked about how they adopted FIX for these processes. He said 60% of trade breaks happen in settlement instructions. This is particularly problematic in emerging markets. Now, their system allows them to identify exceptions in real time, know which item had an exception, and identify the precise cause of the exception (e.g., commission, instructions, etc.) without hunting through data. The panelist said his firm has seen substantial ROI from the deployment.
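To show the shape of the messages involved, here is a hand-rolled sketch of a FIX 4.4-style AllocationInstruction (35=J); a production flow would go through a FIX engine, which also computes BodyLength (9) and CheckSum (10).

```python
# Sketch of a FIX 4.4-style AllocationInstruction (35=J) assembled by hand,
# just to show the shape of the message that replaces phoned allocations.
# BodyLength and CheckSum are omitted for brevity.
SOH = "\x01"

def allocation_instruction(alloc_id, symbol, side, qty, avg_px, splits):
    fields = [
        ("35", "J"),               # MsgType = AllocationInstruction
        ("70", alloc_id),          # AllocID
        ("71", "0"),               # AllocTransType = New
        ("626", "1"),              # AllocType = Calculated
        ("55", symbol),            # Symbol
        ("54", side),              # Side: 1=Buy, 2=Sell
        ("53", str(qty)),          # Quantity
        ("6", f"{avg_px:.4f}"),    # AvgPx
        ("78", str(len(splits))),  # NoAllocs repeating group
    ]
    for account, alloc_qty in splits:
        fields += [("79", account), ("80", str(alloc_qty))]  # AllocAccount, AllocQty
    return SOH.join(f"{t}={v}" for t, v in fields) + SOH

msg = allocation_instruction("ALLOC-42", "VOD.L", "1", 100_000, 223.45,
                             [("FUND-A", 60_000), ("FUND-B", 40_000)])
print(msg.replace(SOH, "|"))
```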
Cyber Security
I must admit that the cyber security panel puzzled me. Based on what the panel discussed, I was not convinced there is a role for the FIX Trading Community in this topic, if the discussion is restricted to how it relates to the FIX protocol.
They talked about their effort to categorize threat actors and the need to develop layers of protection. The panel pointed out that FIX networks rely heavily on network access protocols to protect against malicious attacks. One speaker talked at length about the brittleness of FIX engines. He described how in his testing, FIX engines fall over if he adds too many characters to the client order ID FIX tag.
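A sketch of that kind of robustness check, assuming it is run only against a test session, might simply grow the ClOrdID field and observe how the counterparty engine responds:

```python
# Sketch of the robustness check described above: build NewOrderSingle (35=D)
# messages whose ClOrdID (tag 11) grows far beyond typical lengths, for use
# against a *test* FIX session only. Checksum/length handling is omitted;
# a real harness would layer this over a FIX engine and a non-production endpoint.
SOH = "\x01"

def oversized_order(clordid_len):
    clordid = "A" * clordid_len   # deliberately oversized ClOrdID
    fields = [("35", "D"), ("11", clordid), ("55", "TEST"),
              ("54", "1"), ("38", "100"), ("40", "1")]   # Side, OrderQty, OrdType=Market
    return SOH.join(f"{t}={v}" for t, v in fields) + SOH

for n in (64, 1024, 65_536):
    msg = oversized_order(n)
    print(n, len(msg))   # in a harness, send msg and watch for rejects or session drops
```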
Admittedly, there is not much security built into the protocol. Most firms and most FIX engine vendors prioritize low latency over security. The panel pointed out that while FIX to FIX is an authenticated protocol, the FIX engine is talking to multiple internal unauthenticated systems (via APIs). The risk is that if an attacker gets in via the FIX routing network, they can gain access to the rest of the trading infrastructure fairly easily.
The panel made a big deal about the vulnerability of the protocol and the FIX engines to hackers. But when I pressed them during the Q&A session, they would not (or could not) point to any instance where hackers had gained control of a FIX engine. The fact is that most hackers will gain access through a “front door.”
Someone in the organization receives an email, clicks a malicious link, and malware worms its way into the network.
In my admittedly under-informed opinion, I concluded that focusing our attention on the FIX engine is like tightening the bolts on the bank vault while leaving the teller passwords taped to their workstations.
But as I was writing this post, I decided to check out a document they mentioned during the panel. Back in 2012, the SANS Institute published a detailed white paper that talks about how a firm’s FIX infrastructure could be exploited. This is critically important reading for anyone in electronic trading with responsibility for security.
Firms have to assume that they’re going to be compromised eventually. Usually, by the time a firm discovers a breach, the hacker has been inside for months or even years. Based on that SANS report, there are significant vulnerabilities that need to be addressed.
A Job Well Done
Congratulations to the organizers and committee members who planned and executed this event. The team clearly made some really good decisions that served the conference and the community well. You all did a great job this year!! Thanks also to Thomson Reuters for hosting and to all the sponsors for contributing to this event.

Current Evolution Of The Buy-side/Sell-side Relationship

With Saku Sänkiaho, Senior Analyst at Pohjola Asset Management Execution Services.
The buy-side has a growing desire to take more control. Increasing scrutiny of buy-side trading and execution practices by the regulators, senior managers and clients is driving transparency and the need for greater accountability.
And there has been a growing suspicion from the buy-side that they need to take greater control in order to avoid suboptimal liquidity and execution. The buy-side has been working hard to increase their awareness and improve their skill-set when it comes to trade execution.
As a consequence of this growth in their knowledge, the buy-side increasingly demands customisation and acquisition of different trading tools and algorithms. When we developed our own buy-side smart order router (SOR) with ITG in 2011 the idea was to take full control of the routing and execution. The need for control is one of the most significant trends coming out of the buy-side at the moment in terms of technology and development.
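As a toy illustration of the routing decision a buy-side SOR makes (not the router built with ITG), the sketch below splits a parent order across venues by displayed size at the best price:

```python
# Illustrative sketch of the decision at the heart of a smart order router:
# split a parent order across venues in proportion to displayed size at the
# best price. This is a toy model, not the router described in the article.
def route_order(order_qty, venue_quotes):
    """venue_quotes: list of (venue, displayed_qty) at the best offer."""
    routes, remaining = [], order_qty
    # take the largest displayed size first
    for venue, displayed in sorted(venue_quotes, key=lambda v: -v[1]):
        if remaining <= 0:
            break
        child = min(displayed, remaining)
        routes.append((venue, child))
        remaining -= child
    return routes, remaining   # remaining > 0 means the order needs a resting strategy

routes, leftover = route_order(10_000, [("XHEL", 4_000), ("CHIX", 3_500), ("TRQX", 1_500)])
print(routes, leftover)   # [('XHEL', 4000), ('CHIX', 3500), ('TRQX', 1500)] 1000
```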
Catching up with the sell-side
Also contributing to the increasing involvement from the buy-side, is the way that the buy-side is now better able to measure their performance. Using better developed TCA, traders can monitor comparative performance of algos and execution and have a more informed discussion with their brokers; the buy-side can tell their brokers what they would like to see and then of course the sell-side can help them deliver. The process will be more consultative because both have similar levels of understanding about market structure and the needs of traders. This means that there is more interaction between the two, which is a good thing.
Currently, many buy-side firms use external TCA providers, which gives them standardised, comparable TCA reports that they can use to directly compare their brokers. Relying on broker reports is difficult because different metrics are used; you end up comparing apples and oranges. Independent execution analysis that successfully combines macro and micro level TCA is a way for the buy-side to look under the hood to fine tune the car, so to speak; firms have to be able to capture and analyse their own data.
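One of the simplest standardised metrics in such reports is slippage against arrival price in basis points; a minimal sketch, with invented fills:

```python
# Minimal sketch of one standardised TCA measure: slippage of the average
# execution price against the arrival price, in basis points. Positive numbers
# mean the fill was worse than arrival. Sample data is invented.
def slippage_bps(side, arrival_px, fills):
    """side: 'buy' or 'sell'; fills: list of (price, qty)."""
    qty = sum(q for _, q in fills)
    avg_px = sum(p * q for p, q in fills) / qty
    signed = (avg_px - arrival_px) if side == "buy" else (arrival_px - avg_px)
    return 10_000 * signed / arrival_px

print(round(slippage_bps("buy", 100.00, [(100.05, 3_000), (100.10, 2_000)]), 2))  # ~7.0 bps
```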
When we started the buy-side SOR user group, which currently consists of 15 different buy-side firms, the whole approach was to share ideas with each other. Because everybody has their own experience from the market, we can work better once we have shared these thoughts. Everybody has a different trading profile and different benchmark, but we can help standardise the data and conversation. The aim is to also give the members of the group more data to use in discussions with their brokers.
There is a definite evolution in the way that the collaboration has increased between the buy-side and sell-side, but maybe there is more that brokers need to do to move that forward.
What does the future hold?
It is a long-term trend that the buy-side is taking more control. First, traders started to use algos to implement electronic trading strategies; then the focus moved to liquidity-seeking algorithms. Now more focus is put on which venues to access; as the buy-side increase their ability and knowledge, they are becoming more demanding.
After increasing our knowledge on electronic execution and order routing, for us, the next logical step was to create the buy-side controlled SOR to offer an unconflicted way of interacting with the aggregated liquidity without unnecessary intermediation.
However, every firm develops their own methods in regard to how they would like to interact with the sell-side; what level of collaboration is required and on what frequency. But I think that there will definitely be even more involvement. The collaboration between the buy-side and sell-side will evolve towards more standardised practices, transparency and comparability of data.
Tightening regulation will also increase the execution quality that brokers provide and drive greater transparency between the buy-side and sell-side. When research and execution are unbundled, the trading team has more autonomy with whom to trade. In this case, the only way for an executing broker to win mandates is to provide order execution of high quality. Transparency is now seen as an integral part of a high-quality execution service.
As the buy-side increasingly take control, and collaborate more frequently with the sell-side, technological advances will be a fascinating area to follow. There will be a need for sell-side technology and market knowledge, but they do need to adapt to stay relevant.

The Fixed Income Status Quo, And Project Neptune

By Sassan Danesh, Managing Partner, Etrading Software
Today, reading about the challenges faced by the credit market is a tale with recurrent themes of tightening regulatory controls, changing business models, market structure, and discussions about the increasing impact of electronic interaction in the marketplace.
Historically, the credit market was an OTC voice-traded domain. In the last decade, technological advances have prompted an increase in electronic transactions in this space. Yet, this dematerialisation is largely disjointed as market participants have implemented bespoke technologies in a silo approach to regulatory reform. Digitisation of the credit markets is proving both a cause and a solution to fragmentation.
As dealers’ inventories are reduced in the wake of regulatory changes, so is their ability to act as effective market makers – making liquidity more difficult to both source and leverage. For investors, further challenges are found in the distribution and sorting of pre-trade data; this is the data distributed by market makers that allows investors to locate and obtain the best price on assets for their portfolios. Currently, they receive this data from multiple sources in multiple formats and have to rely on a range of methodologies to sift and collate the data.
In a broader sense, the credit market could be seen as being in good health. Since 2007, corporate bond issuance has increased – growing from $600bn in that year to $1.8tr in 2012. This has coincided with a decline in direct bank lending and a very low interest rate environment. However, the start of 2015 has seen the lowest levels of corporate bond issuance since 2010, down around 20% from the same period in 2014. There appears to be some correlation with the growing uncertainty in the macro-economic environment. Falling oil prices, divergent monetary policy in developed countries, and reduced growth in developing markets including China are having an effect. Political instability is also a factor in the Eurozone, notably Greece, and even the UK. A jittery market is a difficult place for issuers to feel they have value, but it is still too early to tell whether issuance is genuinely slowing. Should it slow, it would be another pressure on the liquidity of the credit market as bonds mature and companies reduce their debt profiles.
The regulatory landscape is also changing this year. MiFID II rules will be firmed up towards the middle of 2015, with Basel III and EMIR also phasing in during the year. Basel III’s particular focus on the liquidity coverage ratio, and its treatment of corporate bonds as ‘High Quality Liquid Assets’ with a haircut of 50%, will further push banks away from holding these types of instruments. Whilst this is only true for issuance with a credit rating between A+ and BBB, it nevertheless covers a significant portion of the corporate bond market. Demand for high quality corporate bonds will increase further as EMIR’s centralised clearing for OTC derivatives begins in Q1 this year. Lower quality corporates will incur a higher haircut charge as non-cash collateral, should they be eligible at all.
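As a worked example of that haircut (ignoring the caps and other LCR mechanics), the calculation is simply:

```python
# Worked example of the 50% haircut mentioned above: a corporate bond rated
# between A+ and BBB counts towards the liquidity coverage ratio at only half
# its market value. Figures are illustrative.
market_value = 100_000_000                    # $100m of eligible corporate bonds
haircut = 0.50                                # LCR haircut from the article
lcr_value = market_value * (1 - haircut)
print(f"Counts towards the LCR at ${lcr_value:,.0f}")   # $50,000,000
```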
All of this means that addressing the issues of liquidity is paramount in the credit market and there are a number of initiatives attempting to fix the problem. Some are trying to simply change the market structure – for example by enabling the buy-side to trade directly with each other in a ‘trusted player’ market place. These are proprietary, closed markets that have specific access rules and behaviour. This should help access the large portion of assets that are currently sitting as inventory on the buy-side. However, for those bonds which only trade infrequently in the open market there is the significant issue of finding a price. Not all investors are willing to become price givers. Evaluated price generation is becoming increasingly popular but it is difficult to validate ‘best execution’ for many illiquid assets.
A number of other initiatives lean on existing execution venues or trading platforms. They encourage the sell-side to publish their inventory onto the platform and then aggregate it for the buy-side. The best approach is still under contention, with some saying that the focus should be on the most liquid part of the market and others saying the opposite. One thing does appear to be clear – almost all of the current set of initiatives are closed, proprietary plays to enhance existing enterprises, despite some large participants and regulators claiming that standards are the key to solving the current challenges in the credit market.
Project Neptune is an initiative married to the standardisation ethos. Sponsored by institutions from both sides of the market, the project’s first phase in Q4 last year was focused on developing open standards, using the FIX protocol, to support the workflows of structured pre-trade data exchange. With these now completed and ratified under FIX governance, the project is moving into an implementation phase. The idea is not to compete with trading venues or liquidity platforms but rather to create a network that can be used as the basis for pre-trade communication. The collaborative nature of the project ensures that there’s no attempt to capture advantage – all sponsors are clear that this is just opening a direct channel between banks and their clients that uses open standards. This means that, rather than as they do today with their customised delivery mechanisms, banks can deliver structured data to whomever they wish without having to support complex data mapping and transformation systems; improving control whilst reducing risks and costs.
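To illustrate the data-mapping burden that a shared standard removes, the hypothetical sketch below normalises two dealers’ differently shaped axe feeds into one record; the field names and formats are invented and are not Neptune’s ratified FIX workflow.

```python
# Hypothetical sketch of the normalisation problem a common standard removes:
# two dealers describe the same kind of axe in different shapes, and the client
# maps both into one record. Field names and formats are invented.
def normalise_axe(dealer, raw):
    if dealer == "BANK_A":      # e.g. {"isin": ..., "way": "BID", "size_mm": 5, "px": 99.5}
        return {"isin": raw["isin"],
                "side": "buy" if raw["way"] == "BID" else "sell",
                "quantity": raw["size_mm"] * 1_000_000,
                "price": raw["px"], "source": dealer}
    if dealer == "BANK_B":      # e.g. {"id": ..., "direction": "S", "qty": 3000000, "level": 101.2}
        return {"isin": raw["id"],
                "side": "sell" if raw["direction"] == "S" else "buy",
                "quantity": raw["qty"],
                "price": raw["level"], "source": dealer}
    raise ValueError(f"no mapping for {dealer}")

axes = [normalise_axe("BANK_A", {"isin": "XS0000000001", "way": "BID", "size_mm": 5, "px": 99.5}),
        normalise_axe("BANK_B", {"id": "XS0000000002", "direction": "S", "qty": 3_000_000, "level": 101.2})]
print(axes)
```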
The use of open standards in this way – development and ratification followed by a shared implementation – points us towards a different approach to dealing with market structure issues. Many of these issues have, as mentioned earlier, come about because of the swift implementation of technology – that rapid growth left little room for standards, especially in the OTC space. With the new focus on costs, liquidity and fragmentation, it seems that it is now time for standards to play a stronger role in restructuring the marketplace and bringing supply and demand together.

Project Neptune : A Buy-Side Angle

With Lee Sanders, Head of Fixed Income and FX Dealing, AXA IM
There are a lot of new initiatives that have emerged in fixed income. The principle behind Project Neptune is to simplify these practices. It’s equally relevant to both the buy- and sell-side as both incur a lot of costs from accessing each other.
From the buy-side perspective: getting your order management system provider to connect to more than one third party platform at any one point is very challenging and might require a six-month lead time (at best). So imagine that an optimal execution framework identifies 30 initiatives; even if we think only ten of them are relevant, it would take a lot of intense work over a one or two-year period to integrate access to those ten platforms with your OMS. You’re incurring a huge amount of expense because you pay your OMS providers to do that work and additionally that invariably means that your own very limited IT resource is stretched even further.
While to some extent we feel that we will lose some of the competitive advantage that we currently have in our set up, initiatives like Project Neptune, which is intended to reduce costs by standardising connectivity between market participants, are of great value to us and the industry as a whole. I think that reflects the general principles of the FIX Trading Community as well as addressing some of the specific issues that are evolving in terms of market structure and liquidity in fixed income at the moment.
We support Project Neptune because the concept is platform neutral across OMS and EMS. By using a single standard protocol all market participants can talk to each other, and the underlying principle is the platform’s agnostic nature. You can get to those platforms via an OMS, via an EMS, via a GUI, any way you like, but it’s the language that connects it all up.
If you sit down and look at all these platforms, you have to be agnostic; that is at the core of what we are driving towards: getting a consensus view on where we put our attention. Neptune makes more sense than anything else, as it is about trying to get as near to the mid-point of an execution as we can, or better.

Flash crash rumbles on : Lynn Strongin Dodds


IS HOUNSLOW HOUND A LONE WOLF?

It is no secret that high frequency traders are an ongoing source of discussion. The recent arrest of trader Navinder Singh Sarao over his alleged role in causing the 6 May 2010 flash crash adds further fuel to the debate. He was not part of a large organisation but a one-man band in Hounslow (a borough in West London), and the question is how a sole trader could wreak so much havoc on the market.

The case has shocked the trading world. The regulators have spent untold hours investigating one of the most mysterious episodes in market history and charges were only brought against Sarao after a whistle blower provided hundreds of hours of analysis of the trades. He is now charged with 22 counts of financial crime and is fighting extradition to the US.

Sarao is accused of employing an automated trading programme to manipulate the market for S&P 500 futures contracts on the Chicago Mercantile Exchange. The Justice Department alleges that, between 2010 and 2014, he manipulated futures markets through illegal computerised trading techniques known as spoofing and layering, earning $40m in profits. It claims that these types of activities were a major contributor to the selling pressure that caused the Dow Jones Industrial Average to tumble 1,000 points on 6 May 2010 before bouncing back later that same day.

US investigators also allege he conjured up to £600,000 a day by placing fake orders and then cancelling them as they were about to go through. They were being modified within 0.2 seconds, which is fast, but as Sarao has countered, “I trade very large, but change my mind in a second”.

On the surface, Sarao does not seem to be one of those HFT traders described in Michael Lewis’s Flash Boys. In fact, he referred to himself as “an old-school point-and-click prop trader” in a survey carried out by the Financial Conduct Authority (FCA) which assisted the US investigation. However, details of his computer set-up are sketchy, with court papers claiming he used a commercially available trading platform with a modification he called Navtrader, designed with Edge Financial. Emails also suggest he was frustrated by the bigger, more sophisticated operations in the market.

It was also reported he complained to the FCA last year that his orders placed through the “iceberg” function at the CME – which allows traders to place orders with the size partially hidden – were being trampled by fast-moving algorithms.

The jury is out as to what really took place but many believe fighting extradition could be a losing battle.

Lynn Strongin Dodds, Editor.

©BestExecution 2015


Profile : Steve Garrard : Citi


NEW CHALLENGES, NEW OPPORTUNITIES.


Steve Garrard, EMEA head of equities sales trading at Citi, explains how the broker is responding to a changing environment.

What impact has regulation had on the equities trading environment over the past year?

There have been several changes over the past few years which have had a significant impact. They range from the French and Italian Financial Transaction Tax to the more recent implementation of T+2 settlement to the Financial Conduct Authority’s thematic reviews on best execution and the dealing commission regime. In many ways we see these changes as new opportunities. For example, take the Scandinavian markets which are dominated by local brokers. Unbundling enables Citi to offer a global commission sharing arrangement and access the flows and liquidity which previously resided domestically in the region.

What effect do you think MiFID II will have?

The 4% and 8% caps under the reference price waiver will certainly have an impact. Our clients tell us there is value, and a need for non-displayed liquidity to achieve cost effective execution. The caps could potentially impact that. One of the outcomes is that we may see an increase in blockier type trading via the large-in-scale (LIS) waiver, which is not affected by the caps. I also see flow moving from non-displayed venues to regulated exchanges and lit MTFs. The LIS waiver, we hope, will help to improve execution quality for large orders.
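As a rough sketch of the cap arithmetic (with invented volumes and generic venue names), a firm might monitor its universe along these lines:

```python
# Sketch of the double volume cap arithmetic referred to above: trading under
# the waivers is capped at 4% of total volume in a stock on any single dark
# venue and 8% across all venues, over a rolling window. Figures are invented.
def cap_breaches(total_volume, dark_volume_by_venue, venue_cap=0.04, market_cap=0.08):
    total_dark = sum(dark_volume_by_venue.values())
    venue_breaches = [v for v, vol in dark_volume_by_venue.items()
                      if vol / total_volume > venue_cap]
    market_breach = total_dark / total_volume > market_cap
    return venue_breaches, market_breach

venues = {"MTF-A": 3.2e9, "MTF-B": 4.5e9, "MTF-C": 2.1e9}
print(cap_breaches(total_volume=100e9, dark_volume_by_venue=venues))
# (['MTF-B'], True)  -> 4.5% on one venue and 9.8% overall
```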

Has the ability to achieve best execution improved since MiFID I?

MiFID I has resulted in an increase in fragmentation and it has in theory become harder for the sellside to achieve best execution. The challenge for the buyside is that best-ex is defined by the sellside that is executing the order, so there is no standard. However, the industry has been focusing on building the tools as well as fine-tuning and adjusting the strategies accordingly. We assess new venues, the fragmentation of the liquidity and our ability to achieve best execution. I think the monitoring process for best-ex has become relatively commoditised and available to not only the larger but also smaller to medium sized clients. It is incumbent upon the sellside to have fully disclosed policies and procedures around the venues they interact with and who interacts with client order flow.

There has been a lot of press about the buyside taking greater control of their executions. How has Citi responded? Did you have to restructure your execution services?

We have encouraged our buyside clients to take greater control of their execution process. We offer the entire product suite and execution service ranging from low to high touch which is still in demand because of the proliferation of venues. Clients have also increasingly preferred to have a single point of contact over the past two years. They like to have one person who can assist with the execution of the order but also has the skillset to trade via high and low touch channels and across different geographies. Multi-asset skills are also in demand as there are several multi-asset funds holding equities.

Clients though do differ and some will put programme and electronic trading together for a low touch service while others are prepared to pay for liquidity provided by a high touch sales trader. We are also seeing clients build into their order management systems the ability to trade on an order-by-order basis via a low touch electronic service with high touch visibility.

I read Citi made a number of senior appointments within its European equities business last year, including Sam Baig as the new head of execution platform for EMEA. What is the reason behind this, as well as his new role?

There is no doubt that the market and trading has become more complex due to the rise of new venues as well as the increasing sophistication of the buyside. We decided we needed a much more thoughtful approach to execution in terms of how we look at the venues we connect to, the composition of those venues, the depth of our book, the bid/offer spread and the profile of market makers.

We also have a large inventory of stocks that can be made available and Sam was brought in to look at all the pieces of liquidity that reside on the trading floor to see how we can better consolidate that liquidity and the crossing opportunities that can take place.

Can you please tell me a bit more about your new Total Touch product and other products in the pipeline?

We launched Total Touch in the US in 2010 and it was extremely successful. It allowed broker dealers to transact hybrid block orders on a bespoke, electronic basis. The average trade size of this hybrid block pool at Citi was 65,038 shares versus average dark pool prints of 228 shares. The product suite supports over 3,500 securities worth $20bn in the Americas and we launched it in Europe in September 2014 with an initial 560 symbols. It currently supports 680 securities across 13 sectors, 7 currencies and 16 countries in the region, representing approximately $3bn of actionable liquidity. The aim is to maximise block transactions whilst minimising signalling and leakage from a client perspective.

In Europe, we started the offering in ordinary shares and exchange-traded funds but have plans to roll it out into ADRs (American depositary receipts).

I read that Citi is among several banks looking to set up a new European equity venue called Plato. Is that still on the cards and if so, what are the drivers?

Very much so on the cards. Plato is a consortium of asset managers and broker dealers which are collaborating to create a not-for-profit trading utility in Europe. The consortium’s key aims are to increase transparency, reduce trading costs, simplify market structure and act as a champion for end investors.

One of the areas we are particularly excited about is the Market Structure Innovation Centre. There has been a lack of independent academic research covering European market structure, and the Plato initiative will look at filling that gap by sponsoring academic research to improve innovation and help inform the market structure debate.

What do you foresee as being the biggest opportunities as well as challenges this year?

In terms of opportunities, we are well positioned in the wholesale markets with our execution to custody (E2C) solution, which has become increasingly relevant for our clients in the current environment and is facilitated by the recent integration of our markets and securities businesses. It cuts out the middle piece of the clearing and settlement process and reduces the cost per trade by automatically generating client settlement instructions based on predetermined information supplied during implementation.
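As a purely hypothetical sketch of that idea, a settlement instruction can be derived by merging trade economics with standing settlement details agreed at onboarding; the field names below are invented.

```python
# Hypothetical sketch of auto-generating a settlement instruction: merge a
# trade's economics with standing settlement instructions captured during
# onboarding, so no manual step is needed. Field names are invented.
STANDING_INSTRUCTIONS = {
    ("CLIENT-1", "XLON"): {"custodian": "CUSTODIAN LONDON", "safekeeping_acct": "SK-001",
                           "cash_acct": "CA-001", "place_of_settlement": "CRSTGB22"},
}

def settlement_instruction(trade):
    """Merge trade economics with the standing instructions for this client/market."""
    ssi = STANDING_INSTRUCTIONS[(trade["client"], trade["market"])]
    return {**ssi,
            "isin": trade["isin"],
            "quantity": trade["quantity"],
            "settlement_amount": round(trade["quantity"] * trade["price"], 2),
            "settlement_date": trade["settlement_date"]}

trade = {"client": "CLIENT-1", "market": "XLON", "isin": "GB0000000001",
         "quantity": 50_000, "price": 4.35, "settlement_date": "2015-06-03"}
print(settlement_instruction(trade))
```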

As for challenges, the regulatory process will continue to be consuming for the business.

Biography:

Steve Garrard, who has been at Citi since 1999, is responsible for running equity sales trading within Citi’s EMEA Institutional Client Group. Prior to joining Citi, he worked as a sales trader at Smith New Court which was taken over by Merrill Lynch in 1995.

©BestExecution 2015

