Turning regulatory big data requirements on their head

ONE of the biggest changes the new regulations will bring is an exponential increase in data storage requirements. But these volumes of data should not be treated merely as a necessity for regulatory compliance: data needs to be stored, and it needs to be accessible. “If you don’t use all the data available, one of your competitors will,” says Simon Garland, chief strategist of KX Systems. Storing just enough data is no longer enough. “In the past, clients would not be worried about throwing away terabytes of data,” says Phil Read, operations director for Cisco. Now clients demand every piece of data available as they look to monetise “big data” – and this is where smart IT infrastructure plays its part. “We’re looking to new types of computing platforms and architecture to achieve this, along with data analytics systems,” says Read.

Nowhere is the big data issue more relevant than in the high frequency trading sector. As part of the Mifid II requirements, trading firms must have effective systems and controls in place to ensure resilient trading systems, sufficient capacity to handle peak order volumes and safeguards to prevent erroneous orders. Some have seen the requirements dictated by oncoming regulations as an attempt to curb the high frequency trading sector. But through efficient and intelligent IT infrastructure and data storage, algorithmic traders can turn the huge volumes of regulatory data required to their advantage.
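To make the safeguard requirement concrete, a pre-trade check of the kind Mifid II calls for can be sketched in a few lines. This is an illustrative simplification only: the thresholds, field names and the `check_order` function are hypothetical, not drawn from any venue's specification.

```python
# Illustrative sketch of a pre-trade safeguard against erroneous orders:
# reject anything that exceeds a size limit or strays too far from a
# reference price. All thresholds here are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    price: float
    quantity: int

def check_order(order: Order, reference_price: float,
                max_quantity: int = 10_000,
                price_collar: float = 0.05) -> bool:
    """Return True if the order passes the size limit and the price collar
    (here 5% either side of the current reference price)."""
    if order.quantity <= 0 or order.quantity > max_quantity:
        return False  # fat-finger size check
    if abs(order.price - reference_price) > price_collar * reference_price:
        return False  # price far from the market: likely erroneous
    return True
```

In practice such checks sit in the order gateway, in front of the matching engine, so a runaway algorithm is stopped before its orders reach the market.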

Many are turning to cloud computing, or to a hybrid of cloud and remote, secure, privately owned servers – an approach that combines the ease of access and data capacity of the cloud with the security of private infrastructure.
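The hybrid pattern can be sketched as a simple routing rule: sensitive records stay on privately owned servers, while bulk data goes to elastic cloud capacity. The store classes below are hypothetical stand-ins, not a real cloud SDK.

```python
# Hypothetical sketch of hybrid storage routing: client-identifying data
# stays on a private store; everything else goes to a cloud store.
class PrivateStore:
    def __init__(self):
        self.records = []

    def save(self, record: dict):
        self.records.append(record)

class CloudStore(PrivateStore):
    """Same interface; in reality this would wrap a cloud provider's API."""
    pass

def route_record(record: dict, private: PrivateStore, cloud: CloudStore):
    # The routing criterion here (a "contains_pii" flag) is illustrative;
    # real deployments classify data under their own compliance policy.
    target = private if record.get("contains_pii") else cloud
    target.save(record)
```

The point of the pattern is that the classification decision is made once, at write time, so regulated data never leaves the firm's own servers.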

With most algorithms having a shelf life of two months or less, it is important that any data recorded is put to use quickly and efficiently, and that it is stored and accessed accurately. As Simon Garland puts it: “If you think that you’re doing just enough, you’ll get caught short.” In the past, he says, firms could get away without rigorous back testing of their systems; now that everybody is using every byte of data to hone their strategies, you have to do the same just to keep up with the rest.
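The back testing Garland describes amounts to replaying stored market data through a strategy and measuring how it would have performed. A minimal sketch, using a toy moving-average strategy and hypothetical prices (neither is from the article):

```python
# Minimal back-test sketch: replay a price history through a simple
# moving-average strategy and accumulate its mark-to-market P&L.
# The strategy and data are illustrative, not a real trading system.
def moving_average(prices: list[float], window: int) -> float:
    return sum(prices[-window:]) / window

def backtest(prices: list[float], window: int = 3) -> float:
    """Go long when price is above its trailing moving average;
    return total P&L over the replayed history."""
    position = 0   # 0 = flat, 1 = long
    pnl = 0.0
    for i in range(window, len(prices)):
        ma = moving_average(prices[:i], window)
        if position:
            pnl += prices[i] - prices[i - 1]  # mark-to-market while long
        position = 1 if prices[i] > ma else 0
    return pnl
```

Run over a rising series such as `[1, 2, 3, 4, 5]` the strategy stays long and profits; over a falling series it stays flat. The regulatory point is that the larger and cleaner the stored data set, the more trustworthy this kind of replay becomes.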