Challenges around the cloud have remained fairly consistent, but in the context of capital markets IT one of the primary concerns is data.
This is an extract from a paper looking at cloud technology in the capital markets, available here.
When working with the data types consumed and produced by financial services organisations, the following must be considered:
Some providers operate a policy where it is free to load data into the cloud, but you are charged for taking it back out again. This means that once a firm has uploaded sufficient data, it is effectively locked into its provider, as the price of moving the data out is prohibitive.
The greater the mass, the tighter the lock. This is why we refer to it as 'data gravity'.
A number of firms are already in this position and have stated that it would take ten years to recoup their spend if they were to move their data now. Nor is it as simple as deleting the data and starting again: what if this data has regulatory significance and a 5+ year retention policy?
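The lock-in dynamic can be illustrated with some back-of-envelope arithmetic. The sketch below is purely hypothetical: the data volume, egress price per gigabyte, and annual saving are assumptions for illustration, not any provider's actual pricing or any firm's actual figures.

```python
# Back-of-envelope illustration of "data gravity" lock-in.
# All figures are hypothetical assumptions, not real provider pricing.

PETABYTE_GB = 1_000_000  # gigabytes per petabyte (decimal convention)


def egress_cost(data_gb: float, egress_price_per_gb: float) -> float:
    """One-off cost to move data out of the cloud (loading it in was free)."""
    return data_gb * egress_price_per_gb


def years_to_recoup(one_off_cost: float, annual_saving: float) -> float:
    """Years of post-migration savings needed to cover the exit cost."""
    return one_off_cost / annual_saving


# A firm holding 5 PB, at an assumed $0.05/GB egress rate:
exit_cost = egress_cost(5 * PETABYTE_GB, 0.05)

# If migrating elsewhere saves an assumed $25,000 per year:
payback = years_to_recoup(exit_cost, 25_000.0)

print(f"Exit cost: ${exit_cost:,.0f}")      # $250,000
print(f"Payback period: {payback:.0f} years")  # 10 years
```

The greater the stored volume, the larger the one-off exit cost relative to any annual saving, which is the mechanism behind the decade-long payback periods firms report.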
On the topic of data storage, but with wider ramifications, firms must also consider the location of the public cloud datacentres in which their systems would be hosted. Are they where you think they are? Do their locations impact your regulatory and trading responsibilities? Will your systems and applications perform as required?
Another important and often under-appreciated consideration is how much control and influence a client actually has with a public cloud provider. Let's say, for example, that you are a tier one investment bank with a large-scale deployment on a well-known public cloud. The reality is that you are insignificant compared to firms like Netflix and Spotify, so don't expect preferential treatment. It's a reality that investment banks are not necessarily used to experiencing.