
Announcement : SEFs – new research report

SWAP EXECUTION FACILITIES: TAKING THE NEXT STEP.

GreySpark Partners presents a report exploring the emerging SEF landscape, giving insight into recent SEF aggregation requirements and developments. Though some participants are making preparations for the advent of a new trading landscape, the general pace of change is sluggish and haphazard. SEF aggregators are both pertinent and imminent as banks are required to connect with SEFs, and their clients, in turn, will request multiple connections. This report lays out the functional requirements for a reference SEF aggregator solution.

Click here to purchase the report or to download it if you are a CMI Gold subscriber.
 
For further information on GreySpark Partners, click here.

KPMG launches its first investment fund : FT.com

KPMG ENTERS THE MARKET WITH A FUND INVESTING IN DATA AND ANALYTICS BUSINESSES.

KPMG, the giant professional services company, will on Monday launch its first investment fund as it branches out into other areas in a sign of the big changes afoot in the financial services industry.

The move is significant as the fund will invest in data and analytics businesses, which are increasingly popular with investors as technology is considered a growth area that should benefit from economic recovery.

The group, one of the big four professional services companies employing 152,000 people around the world in 156 countries and with combined revenues of $23bn, is creating KPMG Capital, a new wholly owned fund.

Mark Toon, chief executive of the new fund, said: “With more data produced and stored in the last two years than in the rest of human history, many businesses are looking for strategic and practical solutions to manage the volume, velocity and variety of this data revolution.”

KPMG, which is a tax, auditing and advisory firm, has decided to diversify, using its expertise around the world to build a business in data and analytics. The fund, which could be the first of many, will launch with about $100m.

It will invest primarily in data and analytics businesses through strategic acquisitions, technology partnerships and other data and analytics capabilities.

The move follows last month’s announcement by Deloitte, another of the big four professional services firms, that it is buying Seattle-based social media marketing agency Banyan Branch for an undisclosed sum.

Deloitte’s acquisition is another example of how these big groups are looking to take advantage of the rapid changes in technology, social media and data provision.

The KPMG fund will have its headquarters in London and invest globally. It aims to invest in a number of critical business areas including enhancing business flexibility; finance; risk, regulation and compliance; improving workforce productivity; and customer and revenue growth.

The new fund will also co-invest, sponsor and partner other groups at early stages, with the aim of encouraging entrepreneurs to be bold, focusing on growth sectors such as healthcare, financial services, energy and telecommunications.

Separately, KPMG/Markit launched its first report on technology last month. This report will be produced quarterly, analysing UK tech clusters and tracking the sector’s outlook for employment and growth.

The new fund will not be open to third-party investment.


Source: By David Oakley, Investment Correspondent, The Financial Times.
© The Financial Times Limited 2014. 

Announcement : J.P. Morgan Joins Chi-X Global Ownership Consortium

NEW YORK – JANUARY 7, 2014.

Chi-X Global Holdings LLC today announced that J.P. Morgan has acquired an equity stake in the firm.

J.P. Morgan becomes the eighth member of Chi-X Global’s ownership consortium. It joins owners BofA Merrill Lynch, Goldman Sachs, KCG Holdings Inc., Morgan Stanley, Quantlab Group L.P. and UBS AG. Instinet Incorporated, the Nomura Group member that founded Chi-X Global in 2008, remains the majority equity holder through an affiliate.

Fumiki Kondo, Chairman of Chi-X Global, said: “On behalf of the Chi-X Global Board, I would like to welcome J.P. Morgan to the investor group. The addition of J.P. Morgan is an important milestone for Chi-X and is representative of the support that we continue to receive from trading communities around the globe.”

Tal Cohen, CEO of Chi-X Global, added: “J.P. Morgan joins our ownership consortium at an exciting time, as each of our markets continues to enhance its performance. We remain committed to working with the industry to foster healthy and vibrant equity markets that benefit all investors.”

Financial terms of the transaction were not disclosed.


Trading Blocks

Todd Prado, Global Head of Equity Trading, First State Investments, examines the changing role of technology in block trading, and the qualities of the traditional block trader that can’t be replaced.
I view the role of any crossing network as a complement to the role of the sales trader who specialises in block crossing. Liquidnet is a very efficient and cost-effective method of crossing blocks of existing orders, but the key word there is ‘existing’. The sales trader can perform that same role as well, which is where they may be in competition, but the sales trader can bring something to the table that Liquidnet isn’t able to: tapping into potential liquidity.
It is this potential liquidity that can be really helpful, particularly if there is a sales trader who has experience and a lot of good relationships. The clients trust the sales trader and are willing to open up and tell him what they are willing to do. He will be a professional and he is not going to ruin their market. The key for that person is to get that critical mass so that clients view that particular sales trader as the go-to guy; people will approach him and he becomes aware of potential liquidity out there. It is a very specialised role; it takes a long time to earn that trust. It requires someone who has been in the region a while, who knows the markets and, more importantly, knows the relationships. Liquidnet had, and maybe continues to have, a bit of an issue on the trust side – getting people to trust them and their technology as they are, at the most basic level, asking to peer into a firm’s blotter. It involves taking a leap of faith to accept that. The foreign-based asset managers seemed more comfortable with that, as they have been using Liquidnet for years in the US and were familiar with that process. For the domestic houses that weren’t used to using Liquidnet, it has definitely been more challenging to earn that trust.
Liquidnet has found a middle path; instead of scraping the blotter, they allow you to submit what you want to a neutral platform that they will scrape, which has eased some fears. It certainly helps that they are purely an agency broker and there is no proprietary activity going on there.
Execution avenues
Obviously in some markets, and the US is the ultimate example, fragmentation is a reality, but we are increasingly seeing it in Australia and Japan, and it is looking like it will happen in Korea; that drives the increased use of electronic trading and algorithms, in one form or another. The choice becomes whether to give an order to a cash desk, which places the order into an algo, or whether you place it yourself directly into an electronic trading suite that a broker has provided: the majority of orders are in an algo these days just because it has to be that way.
That being said, the peers I speak to are all facing the same issues; any one of us that wants to trade in size would prefer to trade blocks, but the issue is how we find each other. Liquidnet goes a certain way to solving that issue, but there is still the need for a human element. A lot of brokers are aware of this, but the challenge is building that franchise and getting people to congregate around that person or small group of people. Some markets are a little further ahead than others; in Australia there are people starting to get momentum on their block-level trading and earning a reputation for putting a lot of blocks through. We haven’t quite seen that in Asia yet, with the exception of one broker that is starting to make some inroads.
There are fewer sales traders covering more accounts now, which results in less attention to those accounts. There has definitely been an impact. I would suppose that, as a lot of senior people have been let go, there is more opportunity for those who remain, but again it takes a while to build up these relationships. Is a fund manager who has been doing this for 15 years going to open up to someone who has been doing it for five? I’m not sure, but everything is a balancing act.

To read more Block Trading content, click here.

There are certain names you go to, but it is mainly just about getting the trades done. It is also quite market specific, for example in Hong Kong and Asia I use one guy, in India I use someone else, in Australia there is a third person – these are people I trust and they’ll make calls on my behalf, and they’re happy to give the order back if they can’t get it done, and that goes a long way to establishing the trust. For those firms that have the people who have been around longer, there is a higher likelihood that they will be able to get those trades done and they will get the orders as a result, perpetuating that cycle.
Bringing back the blocks?
I would certainly like it to go back towards more block trading. Algos and electronic trading are here to stay, but I would certainly like to put up big blocks, and that is a message I’ve been delivering to the brokers. There is definitely interest on the broker side in pursuing this as there is business to be done, but this doesn’t happen just because you decide to do it. However, this may already exist in the form of the equity capital markets desks, and I have been using these desks to get block trades done.
Sometimes you don’t want the sales trader to know about the order, so you might move away from the cash desk and speak to someone that has those connections, and the ECM desks often have connections to the PMs who can execute the blocks.
This isn’t just sales trading versus block versus Liquidnet; each has different characteristics and reasons for selection.

Blocks and technology

With Lee Porter, Head of Asia Pacific, Liquidnet
How is block trading evolving with new technologies?
Trading large blocks of shares remains challenging, particularly in markets where volumes are thin and spreads are wide. Technology has enabled traders across the globe, including those in Asia, to efficiently access liquidity without moving the market.
Technology has increasingly allowed investors to buy and sell large blocks of shares in a safe and secure environment. Our members understand the need for an institutional network, where volume discovery complements the price discovery on public exchanges.
How are the sell-side trader’s relationships with the buy-side changing for block trading?
While interpersonal relationships are important, traders are ultimately judged on their ability to execute orders in the most cost-efficient and timely manner. The concept of relationships has also evolved with technology. Asset managers on our network today are connected to peers around the globe through technology and presented with trading matches on a safe and secure platform. These relationships benefit from the network effect, where members are able to tap into large pools of liquidity from their desktops.
As many bulge-bracket brokers face challenges, including cost reduction, we have continued to see strong growth which is supported by rising demand.
What new technologies are needed by the sell-side to enhance the block trading desk?
The approach of most sell-side block traders hasn’t significantly evolved with regard to new technology. It is still very much a relationship-driven business, which is about matching a buyer and seller through a personal relationship. There are today, however, more effective options available to traders.
It is also important to look beyond technology to meet the needs of the trader and help them grow in their roles. Liquidnet’s Headway Head Trader Programme helps buy-side traders develop skills for a competitive edge. We work with traders across the world, including those in Asia, to develop management and leadership skills, understand the role of technology and how to work successfully as part of a fully integrated team within large institutions.
How has the trend of slicing trades affected the buy-side attitudes towards the cost/benefit of traditional block trading?
Algorithms continue to see increased demand in Asia and globally. What the buy-side understands is that simply slicing an order doesn’t necessarily lead to anonymous trading, as it can result in an even bigger footprint in the market. There is also a cost factor associated with a higher volume of trading.
Ultimately, the preferred outcome for many asset managers is to trade a large volume of shares in a single transaction without market impact. At the same time, a buy-side trader does need to have the full range of solutions – and the ability to draw upon the right solution at the correct time – for the best performance.

To read more Block Trading content, click here.

For example, when a trader is unable to complete an entire order, it may make sense to access additional liquidity in the broader market through an algorithm. We believe that an algorithm is only as good as the liquidity it can access.

The traditional trader

With Shane Wallis, Managing Director of Block Trading & Execution Services, CLSA
Block trading has evolved a long way since its inception, and the advent of the electronic trading age has continued to advance that change as increasing percentages of orders are broken down and executed electronically. However, the desire of the buy-side to execute large positions through blocks, the fundamental skills required by the block trader, and the role of the trader in being aware of movements in the market and finding the other side to large trades remain the same.
The increased speed of information has had an impact on the risk factors in block trading, as buy-side traders need to be increasingly aware of information leakage and the potential for markets to move against orders. The trust relationship between the block trader and the buy-side is more important than ever as a result. It is this trust that is hard to replace with electronic block execution alternatives, and can be hard to replace in a time of changing sell-side headcounts.

To read more Block Trading content, click here.

Top 10 Takeaways From the FIX Regulatory Briefing

On the 20th November, a wide range of key stakeholders and market participants from across the industry attended a regulatory briefing at which two panels discussed upcoming American regulation and how it will affect the industry. Candyce Edelen of Propelgrowth compiled a top 10 list of takeaways from the event.
The FIX Trading Community (formerly called FPL) put on an excellent regulatory briefing event in New York Wednesday afternoon (Nov. 20th). The topic was emerging global regulations affecting the electronic trading community across all asset classes. I took away a number of interesting insights. Here are some highlights.

1. Shifting Responsibility for Market Oversight

An interesting point came up as the panel discussed some of the regulations now emerging globally. Responsibility for market oversight and the enforcement of orderly markets is shifting from SROs (self-regulatory organizations) to market participants. Individual firms are expected to manage their systems’ compliance and integrity. This makes perfect sense, but now it’s being codified into the rules. Examples brought up included RegSCI, the CFTC Concept Release 2013, and the Hong Kong SFC Consultation Paper. In essence, according to the regulators, anyone who introduces disruption into a marketplace has responsibility. I expect to see more regulations of this nature in the future.

2. Regulatory Ambiguity Has Certain Advantages

The ambiguous nature of regulations came up repeatedly in the discussions. All panelists agreed that the rules are ambiguous by design. Generally, compliance officers like the ambiguity because it gives each firm the opportunity to interpret implementation based on their business model. But this can also lead to problems, because no one in the industry is certain how the regulators will interpret the rules until enforcement actions occur. When that happens, compliance officers around the industry get busy, reviewing the enforcement documents, identifying the issues, and testing their own firms’ policies and procedures to see how they would stand up against the same scrutiny. The SEC’s recent Rule 15c3-5 enforcement action was brought up as an example. In this case, the ambiguity made it tricky to predict how the regulators would enforce the rule. Now the industry has better insight.

3. Everyone Needs to Participate in Comment Periods

Across the board, there was strong agreement about the need for all industry participants to be more proactive in responding during the comment periods for proposed regulations. This is our opportunity to discuss the issues, and it’s critical that the regulators get input from all portions of the industry. Different contingents will have opposing positions, but that’s part of what makes the process work. For example, RegSCI drew 59 comment letters and 18 meetings with SEC staff. This is the kind of commenting every rule should receive. Incidentally, we have very little time left to respond to the 127 questions posed in the CFTC Concept Release 2013. Has your firm submitted responses yet?

4. Commenting Can Result in Delayed Implementation

Manisha Kimmel pointed out that every rule FIF has commented on ended up being delayed. So even if the comments don’t ultimately effect changes in the rules, commenting can help buy the industry more time for implementation. Panelists pointed out how delays have been helpful in allowing them more time to refine their implementation approach and be more prepared.

5. Last Minute Rule Delays are Problematic for Some

The panel discussed how often rule delays are not granted until a few days before (or even on the day) the rule was to go live. Delays may be beneficial for banks, but delayed implementations can have a very negative effect on vendors. One panelist pointed out that they’re planning their 2014 roadmap now. They have to guess at which rules will be required and when, so they can assign resources and plan development projects. Every time a new rule is introduced, affected vendors have to invest substantial resources to be ready on time. Last minute delays affect their ability to be responsive. In many cases, last minute delays force vendors to roll back a new release, directly impacting customers and demanding more time and resources. This affects their ability to provide other critical updates and deliver on other features needed by clients.
Banks also complained about last minute announcements of delays. For example, the limit up/limit down rule is expected to be extended to 4 p.m. by December 8. But many industry participants think the regulator will delay. So banks are struggling with whether their technology teams need to work over the US Thanksgiving holiday in order to be ready in time. If the regulator waits until after the holiday to announce, then teams have to continue to rush preparations.

6. Get Input from the Technology Teams During Comment Periods

Everyone on both panels agreed that regulators consistently underestimate what it will take to implement new rules. The consensus was that technology teams need to get involved in the process earlier. Generally, IT is not involved in the comment period, but they should be. They should be engaging with the compliance and business teams to evaluate what would be necessary to comply based on different interpretations of the rule. That will also allow commenters to provide a more informed analysis of what will be required for implementation, which also might give the regulators improved ability to estimate the time and cost impact for each new rule.

7. Technology and Compliance Collaboration is Critical

Regulations are almost always pretty ambiguous. All the panelists agreed that this is valuable. The ambiguity allows each firm to interpret the rule and customize implementation according to its own business model. But the ambiguity also leads to complexity in figuring out how to implement a rule. Compliance officers need to collaborate more with the technology team. Involve technology teams early in the process, having them review the proposed regulations and discuss what will be needed technologically to implement them. Every voice impacted needs to be involved in determining the best approach. Not only do teams need to discuss technical feasibility, but they also need to understand what other things won’t get done while resources are dedicated to a particular rule implementation.

8. More Industry-Wide Collaboration is Needed

One of the panelists pointed out that there is no competitive advantage to implementing a regulation differently than the rest of the Street. Firms need the ability to customize their business models, but most want to ensure that their approach is similar to what others are doing. Our industry needs opportunities to collaborate and discuss regulatory initiatives. These discussions should involve compliance, risk management, trading and technology personnel. Apparently, there is some sort of forum for regulatory officers, but its members don’t even know that the FIX Trading Community exists, and these compliance people generally don’t understand the technology issues.

9. Regulators are Hiring More Practitioners

One of the panelists pointed out the trend for the regulators to hire more practitioners instead of lawyers. This is an important trend, and in my opinion, it’s about time! Regulating the electronic trading markets is a daunting task. Given the massive amounts of data and the demands of processing and massaging the data to detect market manipulation, only practitioners with extensive market knowledge and quantitative skills will be able to detect the patterns in the data and expose violations. However, the regulators’ budgets continue to be an obstacle in hiring sufficient knowledgeable resources.

10. New Format, Excellent Panels

We tried a new approach this year. We reduced the number of panels, seated everyone at round tables, and had break-out round table discussions after each panel. While it made the planning more complicated, I really liked this approach. It allowed all of us to actively participate in the conversation and to add more voices to the discussion. We’ll be trying it again at the Americas Regional conference in February.
Overall, I think this is one of the best FIX conferences we’ve done in years. Kudos to the entire conference planning committee and to the FPL staff for pulling together such an engaging conference. It’s always an honor to work with such a great team of people in planning these events. Thank you for allowing me to serve as conference committee co-chair again this year.
 
This content originally appeared at http://www.propelgrowth.com/2013/11/22/top-10-takeaways-fix-regulatory-briefing/
For more information and to contact the author, please click here.

‘Alphatizing’ The Entire Firm – Part III

Mat Gulley, Global Head of Trading, Franklin Templeton Investments, is changing how his firm embraces alpha, and is making the PM, research desk, and trader a more integrated unit.
The benchmarks
Lastly, for years we have put significant effort into trade cost analysis in the hope that we could bring metrics, insight and accountability to our process, across each desk and every trader. We have always believed it is impossible to manage what you can’t measure. Furthermore, to accept the notion that benchmarks and metrics were unattainable for buy-side traders was equally unacceptable to us. We have been working with multiple TCA vendors since my tenure began at Franklin Templeton (17 years ago) and have tried to establish metrics which we feel align with our investment process. We also hoped this analysis could become industry standard in order to give all traders the ability to be measured and recognised for their work, with a better understanding of their performance and attribution to the investment process. I believe this is still a work in progress for our industry. We needed our analysis to handle complex data sets that could be used on a daily and intraday basis, and then to ensure the results are driving strategy and behavioral changes.

To read Part I, click here.

We have used participation weighted price for almost 15 years. When we first started using it we called it the ‘value added benchmark’, but it was always a Participation Weighted Price calculation. We wanted a benchmark to align with the portfolio manager’s strategies and beliefs around their short-term alpha. We also wanted to allow for the traders to have some way to voice their short-term investment opinions and be able to measure these opinions. Of course, this requires mutual understanding and collaboration to incorporate the PM’s beliefs about short-term alpha and beta. Another challenge is that this requires a relearning of the traditional investment approach by portfolio managers and analysts, but I do believe it is worth the effort. Because there are many times when short-term alpha, liquidity or the market are not the PM’s focus or expertise, it makes sense to push this piece of the alpha triangle to the trader’s level. If tracked and measured properly this should allow the trader and PM to understand the trader’s performance value add to the decision process over time and thus should create a more objective measure of performance for the trader and the trading process.
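To make the benchmark concrete, here is a minimal sketch of one common PWP formulation, assuming a simple post-arrival tape of (price, volume) prints. The function name, the 20% participation rate and the example numbers are illustrative assumptions, not Franklin Templeton’s actual calculation.

```python
# Hypothetical sketch of a Participation Weighted Price (PWP) benchmark.
# Assumes `tape` holds the market's post-arrival prints as (price, volume);
# real TCA systems add venue filtering, auction handling and time bounds.

def participation_weighted_price(tape, order_size, participation_rate=0.20):
    """VWAP of the market volume a strategy would have traded alongside,
    participating at `participation_rate` of volume from order arrival
    until the order would be complete."""
    horizon = order_size / participation_rate  # market volume to consume
    notional = volume = 0.0
    for price, size in tape:
        take = min(size, horizon - volume)     # cap at remaining horizon
        notional += price * take
        volume += take
        if volume >= horizon:
            break
    return notional / volume if volume else None

# A 100,000-share order at 20% participation spans 500,000 shares of tape.
tape = [(10.00, 200_000), (10.05, 200_000), (10.10, 200_000)]
print(participation_weighted_price(tape, 100_000))  # 10.04
```

The trader’s actual fill price can then be compared with this PWP to express performance as slippage, which is what allows the PM and trader to attribute short-term alpha as described above.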
When we started collecting the specific data around our computer-based trading/algorithm strategies, we had to develop a way to evaluate this data. This was done by taking several benchmarks, such as PWP, IS and post-trade price reversions, and creating an index. Once we were able to analyse our strategies, we put proactive trading tools in place which allowed the traders to select the optimal back-tested strategies to use in their daily trading. Our efforts in TCA were really to measure our historical participation and execution performance within the various exchanges and dark pools in order to dynamically change our strategies and capture opportunities.
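As a rough illustration of the indexing idea only, the sketch below z-scores each benchmark’s slippage series (PWP, IS and post-trade reversion, all in basis points) and averages them into one composite number per order. The weighting and normalisation are assumptions for illustration, not the desk’s actual methodology.

```python
# Hypothetical composite of several TCA benchmarks into a single index.
# Inputs are per-order slippages in bps against each benchmark.
from statistics import mean, pstdev

def composite_index(scores_by_benchmark, weights=None):
    """scores_by_benchmark: e.g. {'pwp': [...], 'is': [...], 'reversion': [...]},
    one slippage per historical order. Returns one composite score per order."""
    weights = weights or {k: 1.0 for k in scores_by_benchmark}
    normalised = {}
    for name, xs in scores_by_benchmark.items():
        mu, sigma = mean(xs), pstdev(xs) or 1.0   # guard zero variance
        normalised[name] = [(x - mu) / sigma for x in xs]
    n = len(next(iter(normalised.values())))
    total = sum(weights.values())
    return [sum(weights[k] * normalised[k][i] for k in normalised) / total
            for i in range(n)]

scores = {"pwp": [-3.1, 0.4, 2.0], "is": [-10.0, 2.5, 6.0],
          "reversion": [1.2, -0.3, -0.8]}
print(composite_index(scores))  # one comparable score per order
```

Ranking strategies by a composite of this kind is one way the “optimal back-tested strategies” mentioned above could be surfaced to traders.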

To read Part II, click here.

This leads us back to the integration and collaboration conversation we continuously engage in with the managers. Do they know what you’re trying to accomplish and vice versa? In an ideal environment, there should not be much surprise around execution outcomes: here at Franklin Templeton, we like to say a best execution is a predictable execution. Once we have a conversation with a portfolio manager and the trader takes on the short-term alpha decision and the accountability for that trade, we want to know that those decisions clearly tie back to the original conversations with the portfolio manager. We want to know there is a clear understanding that the trader’s job is to maximise the short-term alpha horizon and the PM has confidence this is being achieved and measured.
Ultimately, our goal is to have a 50 person global trading team that is constantly relearning, rethinking, evolving, and adapting to what is happening in the marketplace. We want to stress the importance of adapting to technological changes and proactively molding our own behavior as the world around us changes.
The future
There always has been and always will be more to accomplish as we aim to further integrate ourselves into the investment process with research and portfolio management. Buy-side trading, if structured and staffed properly, is generally under-utilised and unrecognised for its potential value. We don’t want to push too far into those distinct areas of investment expertise that delineate the differences between analyst, PM and trader; however, a certain level of integration and collaboration is prudent. A properly skilled trading desk knows liquidity, market structure, trading dynamics and market information, and should assume its role in the value triangle. There is a higher skill level to be achieved and there is an opportunity to think more broadly about maximising all of your resources. The investment process has to, by default, follow the path from research to portfolio management to implementation. I am convinced that there are opportunities for alpha in this area. Overlooking or misunderstanding the potential alpha opportunity that a strong trading team can capture is a direct cost to clients’ portfolios.

The Cloud : An introduction

INTRODUCTION TO THE CLOUD.

Lynn Strongin Dodds, Editor

The financial services industry is often at the forefront of technological change, but it is only cautiously embracing cloud computing. While firms are drawn to the flexibility and low costs on offer, concerns over security and reliability persist. Providers are trying to allay these fears, but investment banks and fund managers continue to tread cautiously towards the virtual world.

Marc De Groote, chief executive of Callataÿ & Wouters, a technology firm that serves the financial services industry, believes that cloud will gain momentum as “part of the overall outsourcing trend that is taking place across the industry. About half of the banks are outsourcing in some way and I think people do understand the benefits and potential cost savings of using the cloud.”

In fact, a recent study by the Centre for Economics and Business Research, a UK think tank, commissioned by data storage and solutions company EMC, predicts that 60% to 80% of all EMEA-based businesses in the banking, financial and business services sector will have adopted some form of cloud computing by 2015. Views are mixed, though, as to whether adoption rates will be that rapid.

Jon Milward, director of managed and support services at IT consultancy Northdoor plc, believes financial services firms have to put cloud technology into perspective. “In many ways cloud computing does not offer any new functions, in that access to storage, applications and servers is provided and managed by a third party versus in-house,” he says. “It is though a new way of delivering those services, and companies have to understand that the cloud will not make all their IT headaches go away.”

The main benefits, as with any IT outsourcing, are well documented. Financial firms would not have to develop, implement, maintain or upgrade technology. In addition, due to the on-demand nature of cloud computing, there is both scalability and elasticity. For example, if a bank requires surplus computing power or storage space due to an occasional spike in demand, it can look to the cloud.

There are, though, several concerns, the first being the confusion over definitions and the different terms that are bandied about, such as cloud computing, software-as-a-service (SaaS) and application service provider (ASP), which can be viewed as variations on a theme. A recent report by Celent defines cloud computing as the outsourcing of IT hardware – such as storage, servers and other infrastructure services – to a vendor that provides these services via the internet on an on-demand basis. Users therefore pay only for what they use and can call on these services as and when they need them.

SaaS, on the other hand, refers to the outsourcing of various applications, but not hardware. It may seem similar to ASP but Celent believes there are differences. The use of private networks and the constant customisation made the ASP model economically unrewarding, whereas the SaaS model is based on all users having access to the same application and via the same internet-based route.

The other issues, according to Milward, revolve around availability, the legal contracts and the security of data stored and accessed remotely. “As with any technology, such as Amazon or Google mail, there are outages and, as a result, firms have to look at whether they will be compensated if the service goes down. As for security, companies have to check whether they are allowed to have their data stored in a different location than their home market because some regulators do not allow this. Also, in terms of overall security, the best advice is to use a large provider with a track record.”

In fact, there are many vendors in the marketplace who have jumped on the cloud bandwagon and relabelled their offerings as cloud compatible. Stuart Berwick, chief executive officer of Singletrack Systems, says, “We have seen several established IT vendors entering the cloud space, positioning themselves as cloud providers to align themselves with this new trend. What we are seeing is that banks and asset managers are gradually making the move, with smaller firms moving first. It is typically starting with applications like CRM [customer relationship management] and research management.”

Private versus public

These views are underscored by a report by Gartner which notes that cloud computing will evolve in stages, migrating first to what are known as private clouds and then on to public clouds over a period of years. A private cloud typically means that the data centres are still owned by the corporate users and managed in-house. They take advantage of cloud-enabling technologies such as the virtualisation provided by VMware and the internet. By contrast, a “public cloud” usually means that nearly all of a company’s information is outsourced and managed by a third party that owns clusters of data centres.

Simon Watts, chief architect at DST Global Solutions, says, “Concerns over security are making asset managers, for example, extremely wary of public clouds but we are seeing an increase in the use of private clouds because they are ring fenced within an organisation. I think the usage will ramp up as organisations become more comfortable with the technology but to date most of the public clouds are being used by the consumer and retail sectors.”

In the financial community, the first movers have tended to be the smaller asset management houses that do not have the deep pockets to develop their in-house capabilities. Alternative fund managers are also jumping on the bandwagon. “We are seeing an increasing number of hedge funds and private equity firms who are using our services for fund raising,” says Philip Whitchelo, a vice president at technology group IntraLinks. “For example, instead of private equity firms sending out individual prospectuses to investors, they will take all the information about a fund and put it in a cloud-based service for investors to read. This makes the process much more efficient and cost effective.”

Public clouds, particularly those focused on information-based delivery services, are, though, gaining traction among the larger players. For example, Fidessa has enjoyed success with its Fragulator and Tradalyzer products. The former offers comprehensive global content with specific regional analysis on the US, Canada, Japan and Asia as well as Europe, where it pulls data from more than 40 platforms. The tool aggregates, normalises and analyses all of the data reported by dark pools, systematic internalisers and lit venues to the various trade reporting facilities.

The Tradalyzer, which is based on this data, offers customisable consolidated trading analysis across Europe’s fragmented markets. It allows users to compare an individual trade against data consolidated by Fidessa, which in turn assesses the execution quality of that particular trade and rates it against a set of widely used industry benchmarks. These include arrival price, last trade, interval volume-weighted average price (VWAP), 20% VWAP, market open, market close, and previous close.
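As a rough sketch of what such a comparison involves (not Fidessa’s actual Tradalyzer logic), a trade can be rated against each benchmark by computing signed slippage in basis points; the benchmark values and sign convention below are illustrative assumptions.

```python
# Illustrative benchmark comparison: positive bps means the trade beat
# the benchmark for a buy order (paid less), negative means it lagged.

def slippage_bps(trade_price, benchmark_price, side="buy"):
    sign = 1 if side == "buy" else -1
    return sign * (benchmark_price - trade_price) / benchmark_price * 1e4

benchmarks = {            # assumed consolidated benchmark prices
    "arrival_price": 25.10,
    "interval_vwap": 25.15,
    "market_close": 25.20,
    "previous_close": 25.00,
}
trade_price = 25.12       # the individual trade under review
for name, px in benchmarks.items():
    print(f"{name:15s} {slippage_bps(trade_price, px):+7.1f} bps")
```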

According to Steve Grob, director of group strategy at Fidessa, the services run on Amazon.com’s S3 (Simple Storage Service) and the data is encrypted and protected. “Although the financial services industry has concerns over the security of the cloud, there are well-established practices for data integrity and security. If you follow those, then there is no reason why there should be a risk.”

Looking ahead, Jose-Antonio Martinez, managing director of Radianz & Payments at BT Global Services, believes that the new world regulatory order will be one of the key drivers generating demand for cloud services.

Traditionally, the applications accessible via the BT Radianz cloud have been focused on pre-trade, trading and post-trade usage but the firm has expanded its horizons into the compliance and governance arena. Users can tap into LockPath’s Keylight platform for governance, risk and compliance (GRC).

The platform provides US and internationally regulated businesses with a fully automated governance, risk and compliance data reconciliation and reporting process. It consolidates business-critical risk and compliance data and provides a single point of control that aims to ensure compliance across the organisation.

This was the first of several SaaS applications being made available through the BT Radianz cloud. Customers can leverage the BT Radianz cloud to access the Keylight application, while having the option to simultaneously host data in any of the global BT Radianz Hosting data centres.

“Meeting regulatory requirements is one of the biggest challenges facing the financial services industry,” says Martinez. “As a result, there is an increased demand for automated solutions to manage governance, risk and compliance. Overall, I think we will see an increase in the use of cloud services, but many companies may choose a ‘hybrid’ solution whereby some of their more standard applications will make use of the cloud, whilst others will be hosted in-house. Clouds that are able to interconnect easily with these in-house applications and the corporate networks and clouds they’re hosted on will succeed.”

 ©BestExecution 2013.

 

The Cloud : Standards

A CLOUD MAP FOR EUROPE.

Migration to the cloud heralds a host of opportunities for business, but without joined-up thinking regarding the infrastructure, legislation and standards, the hoped-for benefits might not be realized. Best Execution spoke to Ken Ducatel of DG CONNECT* about the strategy decisions being hammered out in the EU to put Europe at the forefront of cloud adoption.

What are the key drivers in developing a single policy area for cloud computing (with particular emphasis on the financial markets)?

Our main goal is to increase the competitiveness of the European economy, to facilitate growth and the creation of jobs in Europe. Today, the European cloud computing market does not make this possible. The EU’s cloud market is fragmented, as different rules apply in different member states.

This patchwork of different rules causes uncertainty, as businesses and individual users are not sure about their legal obligations (for example, they don’t know where legal disputes will be resolved or how to make sure that it will be easy to move data and software between different cloud providers). They are not sure what standards and certificates they should look for to meet their requirements and legal obligations, for example to ensure that their own or their customers’ data is safe or that applications are interoperable. It is only natural that this uncertainty leads them to postpone the decision to adopt cloud solutions.

The new cloud computing strategy** will help Europe make the most of the enormous potential offered by the cloud. According to new estimates, cloud computing revenues in the EU could rise to nearly €80 billion by 2020, if policy intervention is successful (more than doubling the growth of the sector). So this strategy is about building a new industry, and better competing in the global cloud market. More broadly, we expect a net annual gain of €160 billion to EU GDP by 2020 (or a total gain of nearly €600 billion between 2015 and 2020) if the full EU cloud strategy is in place. Without that, economic gains would be two-thirds less.

These benefits largely come from businesses being able to either save money or get access to technology that makes them more productive. In terms of overall job numbers, we expect to see 3.8 million jobs generated following full implementation of the strategy, against 1.3 million if the regulatory and other policy barriers are not removed.

In terms of policy, what kind of framework do you envisage for the European cloud computing strategy?

We want a business framework that will speed up the take-up of cloud computing across the EU economy. To do this, we need to make Europe not just cloud-friendly, but also cloud-active. And this can be achieved by means of three key actions:

We plan to cut through the jungle of technical standards. Standards are emerging, but at the moment there is no common agreement as to which standards would guarantee the required interoperability, data portability and reversibility. That is why we want to identify coherent sets of useful standards to make it easier for the demand and supply sides to organise themselves.

We want Safe and Fair Contract Terms and Conditions to address issues not covered by the Common European Sales Law, such as data preservation after termination of the contract, data disclosure and integrity, data location and transfer, ownership of the data, and direct and indirect liability. Identifying and developing consistent solutions in the area of contract terms and conditions is a way of encouraging wide take-up of cloud computing services by increasing consumer trust.

We are promoting common public sector leadership through a European Cloud Partnership (ECP). This Partnership will bring together public procurement authorities and industry consortia to implement pre-commercial procurement actions. This will allow them to identify public sector cloud computing requirements, to develop specifications for IT procurement, and to procure reference implementations. They will thus be able to advance towards common and even joint procurement of cloud computing services by public bodies on the basis of common user requirements.

Are security and privacy the main challenges, and if so what can be done to allay the fears?

Security is one of the main challenges of cloud computing – though we should bear in mind that risks in the cloud do not necessarily differ much technically from already known risks. However, in the cloud, these risks can amplify quickly and easily, and this increases the effects on multiple clients simultaneously. Cloud-specific security risks relate to the multi-tenancy and shared resources character of cloud computing. They are related, for example, to access control, data storage, data protection, data portability, data integrity and virtualisation. In the cloud, the user cedes control of the security to the service provider; it is thus more difficult for the user and the provider to agree on sufficient assurance levels for the service. This is probably why the cloud is perceived to be less secure.

Privacy risks also relate to the perceived lack of control and to unclear roles of different actors in the cloud service provision chain. In the cloud, personal and other sensitive data can be stored and processed anywhere – also outside the European Union. This in turn creates confusion about the applicable data protection rules.

What we are doing about this is to work very actively to increase the security of IT systems. The European Commission is collaborating with industry and with the European Telecommunications Standards Institute (ETSI) to identify a detailed map of the necessary standards, and is working with the support of the Network and Information Security Agency (ENISA) to assist the development of EU-wide voluntary certification schemes in the area of cloud computing. The Commission is also preparing a European Strategy on Cyber Security that will encompass cloud security. In the field of data protection, we have already proposed a new Regulation which aims to ensure a high level of data protection and addresses issues raised by the advent of cloud computing.

Would simplifying the legislative minefield help?

We are in the course of assessing whether current security legislation needs some streamlining as regards cloud providers. For example, their obligation to notify severe security breaches is being assessed. But amending legislation is not always the best course to follow. For example, standards are voluntary; therefore it is up to market players whether they will endorse them or not. In cases such as this, creating trustworthy certification schemes is a faster and more effective way to increase the security of cloud services than legislation.

What are the challenges to delivering a co-ordinated cloud infrastructure in Europe?

The cloud computing strategy does not foresee the building of a dedicated hardware infrastructure to provide generic cloud computing services to public sector users across Europe. We simply want to see publicly available offerings of cloud-based services that meet European standards not only in regulatory terms, but also in terms of being competitive, open and secure.

This does not preclude public authorities from setting up dedicated private clouds, but in general even cloud services used by the public sector should – as far as feasible – be subject to competition on the market to ensure best value for money, while conforming to regulatory obligations or wider public-policy objectives in respect of key operating criteria, such as security and protection of sensitive data.

What role do funding, research and innovation play?

We want European companies to remain at the cutting edge of technological development and innovation. That is why we have established a vast portfolio of research activities on cloud computing, alongside policy actions. Our budgetary commitment to cloud computing-related research is currently around €400 million and it’s funded through European research programmes.

Moreover, we will try to make full use of other available research and development instruments offered by Horizon 2020*** to tackle long-term challenges specific to cloud computing.

Why does Europe lag behind the US? Is it because there are varying local accounting methods and business practices, and the absence of harmonised rules for the technology?

In Europe our biggest potential weapon for increasing competitiveness is the single market, and if we could get to a vibrant Digital Single Market then I believe Europe would be able to lead the world in technology and services. Too often businesses are stopped at borders which in the online world don’t make sense any more. Many of the actions announced in the Digital Agenda are addressing these problems in respect of Digital Content, e-Commerce or Data Protection and in order to promote cloud in Europe we need to reinforce our efforts in these fields.

What does it mean to be cloud active?

Being cloud-active means creating the conditions that will allow us to make the most of the potential offered by cloud computing. It means we are not standing still, waiting in the shadows for others to lead the way. It means taking initiatives that will open new paths in cloud use in business, research, education, and culture.

That is what we want to achieve with this strategy. We know that in Europe we have the talent it takes to be world leader. We want to allow this talent to flourish. We want it to bring growth and jobs in Europe for us and for the generations to come.

 

*Ken Ducatel is Head of Unit for Software & Services, Cloud Computing, for DG CONNECT. As of 1st July 2012, the Digital Agenda of the EU is managed by the European Commission Directorate General for Communications Networks, Content and Technology, or DG Connect.
 
DG Connect – “The name represents the range of topics where we are active, and our structure aligns the work of the DG with key EU policies for the coming decade: ensuring that digital technologies can help deliver the growth which the EU needs. We work for the Vice President of the European Commission responsible for the Digital Agenda, Neelie Kroes.”
 
**This document can be downloaded from: https://ec.europa.eu/information_society/activities/cloudcomputing/docs/com/com_cloud.pdf
 
***Link to: https://ec.europa.eu/research/horizon2020/index_en.cfm.

 

 
