BIG DATA AND HFT
Nowhere is the big data issue more relevant than in the high frequency trading sector. As part of the MiFID II requirements, trading firms must have effective systems and controls in place to ensure resilient trading systems, sufficient capacity for peak order volumes and safeguards to prevent erroneous orders. Some have seen the requirements dictated by oncoming regulations as an attempt to curb the high frequency trading sector. But with efficient and intelligent IT infrastructure and data storage, algorithmic traders can turn the huge volumes of data the regulations require them to keep to their advantage.
Many are turning to cloud computing, or to a hybrid of cloud and secure, privately owned remote servers – the hybrid approach combining the ease of access and data capacity of the cloud with the security of private infrastructure.
KEEPING DATA RELEVANT
With most algorithms having a shelf life of two months or less, it is important that any data recorded is put to use quickly and efficiently, and that it is stored and accessed accurately. As Simon Garland puts it: “If you think that you’re doing just enough, you’ll get caught short.” He says that in the past, firms could get away without rigorous back-testing of their systems. But now that everybody is using every byte of data to hone their algorithms, you have to do the same just to keep up with the rest.
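To make the back-testing idea concrete, here is a minimal sketch of replaying recorded prices through a trading rule. Everything in it is hypothetical and for illustration only – the moving-average crossover strategy, the function names and the synthetic price series are assumptions, not anything described in the article – but it shows the basic loop of validating a rule against stored historical data before it trades live.

```python
# Illustrative back-testing sketch (hypothetical strategy and data, not
# from the article). Replays a recorded price series through a simple
# moving-average crossover rule and reports the rule's net return.

def moving_average(prices, window):
    """Trailing moving average; None until enough history exists."""
    out = []
    for i in range(len(prices)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(prices[i + 1 - window:i + 1]) / window)
    return out

def backtest(prices, fast=3, slow=5):
    """Long while the fast MA is above the slow MA, flat otherwise.
    Returns the cumulative net return of the rule over the series."""
    fast_ma = moving_average(prices, fast)
    slow_ma = moving_average(prices, slow)
    position = 0          # 0 = flat, 1 = long
    equity = 1.0
    for i in range(1, len(prices)):
        if position == 1:                      # earn the period's return
            equity *= prices[i] / prices[i - 1]
        if fast_ma[i] is not None and slow_ma[i] is not None:
            position = 1 if fast_ma[i] > slow_ma[i] else 0
    return equity - 1.0   # e.g. 0.05 == +5%

# Replay a synthetic recorded series: prices rise, then fall.
prices = [100, 101, 102, 103, 104, 105, 104, 103, 102, 101]
print(round(backtest(prices), 4))  # → -0.0192
```

The point of the exercise is exactly Garland’s warning: run against the full recorded history, the rule loses money on the downturn – something a trader would only discover by testing it against every byte of stored data, not by eyeballing the rising half of the series.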