# Proactively managing volatile market data with the rule of three

Data volumes have spiked on the back of increased market volatility, and many systems across the front and back offices have struggled to keep up. KX explores how financial institutions can optimise data management and analytics platforms to cope with a constantly changing investment landscape

In 2019 a common complaint among FX traders was: ‘Where has all the market volatility gone?’

As if to answer this grumbling, the Covid-19 pandemic emerged, disrupting supply chains and forcing many central banks to row back on monetary tightening. Volatility returned with a vengeance and, so far, shows no signs of abating.

This was good news for many FX traders, who were able to profit on the back of these wild currency swings. It was less welcome news for other market participants, such as corporates, who were forced to re‑examine their currency hedges to ensure they were adequately protected.

Amid this market chaos, many dealers have been doing well from increased client demand to profit from or protect against violent market movements. For example, according to data from Greenwich Associates, the global revenue pool for US dollar options trading jumped more than 50% between 2019 and 2020 on the back of market volatility (rising from $3.4 billion to $5.3 billion). Revenue in 2021 ($4.6 billion) was still at elevated levels.

However, while the return of volatility to the FX markets may be welcome news for some, it also means sensitivity to risk is much higher.

The consequences of putting an FX trade on at the wrong moment can be more severe in volatile market conditions. Moreover, dealers must be more careful over their hedging strategies. This was highlighted earlier this year when sharp moves in the Japanese yen against the US dollar left some FX options dealers nursing their wounds and scrambling to re-hedge.

The upshot of this is that it is becoming increasingly important for dealers to ensure the information on which they base trading decisions is as accurate and as dynamic as possible.

“Traders are increasingly mindful of the fact that, for them to be competitive, they have to remain responsive to changing market dynamics,” says Harry Darrell‑Brown, product manager of financial solutions at KX, which provides time series database technology to financial institutions worldwide. “If they aren’t tech-savvy and don’t move with the times, they can quickly be replaced.”

KX’s data platform, kdb+, allows firms to store, analyse, process and retrieve large data-sets at speed. It has proved particularly popular with high-frequency traders, but has also been deployed in other time-sensitive data applications, such as the energy markets, telecommunications, monitoring equipment used on manufacturing sites and even Formula One racing.
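
To illustrate the kind of workload involved – purely as a sketch in plain Python, not kdb+ or KX’s q language, and using invented tick data – a time-series platform’s bread and butter is aggregations such as bucketing raw ticks into one-minute volume-weighted average prices, run over billions of rows rather than a handful:

```python
from collections import defaultdict

# Hypothetical tick data for illustration: (timestamp in seconds, symbol, price, size)
ticks = [
    (0, "EURUSD", 1.0850, 100),
    (30, "EURUSD", 1.0852, 200),
    (75, "EURUSD", 1.0849, 150),
]

def minute_vwap(ticks):
    """Bucket ticks into one-minute bars and compute a volume-weighted
    average price per (symbol, minute)."""
    sums = defaultdict(lambda: [0.0, 0])  # (symbol, minute) -> [price*size, size]
    for ts, sym, px, sz in ticks:
        key = (sym, ts // 60)
        sums[key][0] += px * sz
        sums[key][1] += sz
    return {k: v[0] / v[1] for k, v in sums.items()}

bars = minute_vwap(ticks)
print(bars)
```

The point of a dedicated time-series database is that this style of grouped, time-bucketed query runs close to the data at scale, rather than in application code.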

### The rule of three

Data gathering and data analysis have long been the purview of technical back-office departments, but this has resulted in an incomplete understanding of the information being captured.

Steve Wilcockson, product marketing manager at KX, says that proactively managing volatile market data is all about what he terms the rule of three.

“It’s about having the right analytics for the right group at the right time,” he says.

It’s also about being able to effectively tailor data needs to meet the requirements of different users.

“You have the quants, who design the system; the data engineers, who build it; and the traders, who use it. Each has a keen interest in data analytics, but all from a slightly different perspective. The platform needs to be able to take care of each of these divergent interests,” says Alex Weinrich, FX and crypto solutions specialist at KX.

Gathering the data on a single platform and providing a user-friendly interface to run analytics allows the whole firm to gain valuable insight into the trades being run each day.

“Capturing the fluidity between these three different roles is absolutely the right thing to do,” says Wilcockson. “Each participant in the group must be able to work together to build, analyse and understand the lifecycle of a trade.”

This firm-wide buy-in is crucial for the successful deployment of the data analytics engine. “Everyone at the firm – from traders to senior management – needs to realise the competitive edge this technology gives them,” says Darrell‑Brown. “If they don’t embrace this new way of doing things but their competitors do, then they are quickly going to be overtaken and will likely be replaced. No one wants that to happen to them.”

### Data optimisation

Processing and managing large volumes of market data is slow and resource-intensive, which can lead to increased costs, reduced agility and, ultimately, uncompetitive pricing. While each data-set may hold important insights – upon which trading decisions can be made – the real power is unlocked once the data is brought under a single roof.

Unfortunately, in many organisations, data still exists in a range of different formats across multiple locations. This is typically a legacy from the way the data was captured in the first place.

Some firms have sought to get around the problem of dispersed data-sets by sampling down the data or producing cached calculations before performing analytics. But this is far from optimal and undermines firms’ ability to have real-time data at their fingertips.

Analytics needs to be performed at the point at which the data sits, says Weinrich.

“In many cases, firms move the data to a different location before they perform analytics on it – which brings latency into the picture,” he says.

Seamlessly combining historical data with real-time data can also yield more valuable insights.

“All large banks and financial institutions use predictive machine learning these days. This means incorporating historical client behaviour into data algorithms, and adjusting models based on how a particular set of clients or liquidity providers is likely to behave following a market shock,” says Weinrich.
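
To make the idea concrete – as a hypothetical sketch only, not a production model or anything specific to KX – an estimate seeded on historical behaviour, such as a liquidity provider’s fill rate, can be blended with live outcomes via an exponentially weighted update, so the model adapts quickly when behaviour shifts after a market shock:

```python
def update_fill_rate(prior: float, filled: bool, alpha: float = 0.1) -> float:
    """Blend the latest fill outcome into the running estimate.
    alpha controls how fast live data overrides the historical prior."""
    return (1 - alpha) * prior + alpha * (1.0 if filled else 0.0)

rate = 0.90  # prior fill rate, seeded from historical data (invented figure)
# A run of mostly rejections after a shock drags the estimate down:
for outcome in [False, False, True, False]:
    rate = update_fill_rate(rate, outcome)
print(round(rate, 4))
```

The design choice is the recurring one in such models: a larger `alpha` reacts faster to a shock but is noisier; a smaller one trusts history longer.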

Combining different data-sets in this way allows firms to use the same code during development as they will use when the system is up and running.

“If the code being executed in production or the data is different from the real world, then how do you know what you’re testing is valid? So it has to be the same,” says Darrell‑Brown.

Pre- and post-trade analytics with real-time responsiveness to market change are vital to the successful integration of any data platform.

“With the FX markets in such a state of flux, traders must be able to hedge their portfolios and make sure they get those FX hedges right,” says Wilcockson. “To this end, they need the ability to quickly recalibrate models to account for sudden shocks to markets.”

This means being able to generate real-time insights as trades complete – not just at the end of the day. Transaction cost analysis, liquidity and venue analysis, and execution analytics are all key to this.
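
A minimal example of one such metric – implementation slippage of the average fill price against the arrival mid, expressed in basis points; the fills and prices below are invented for illustration:

```python
def slippage_bps(fills, arrival_mid, side="buy"):
    """Transaction-cost metric: signed slippage of the size-weighted
    average fill price versus the arrival mid, in basis points.
    fills: list of (price, size). A positive result is a cost."""
    notional = sum(px * sz for px, sz in fills)
    qty = sum(sz for _, sz in fills)
    avg_px = notional / qty
    signed = avg_px - arrival_mid if side == "buy" else arrival_mid - avg_px
    return signed / arrival_mid * 10_000

fills = [(1.0853, 500_000), (1.0855, 500_000)]
print(round(slippage_bps(fills, arrival_mid=1.0850), 2))
```

Run per trade as fills arrive, rather than in an end-of-day batch, this is the kind of number that lets a desk react while the order is still working.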

“It’s not just about having the right tools – it’s about being able to use them effectively,” says Wilcockson.

With no signs of market volatility letting up soon, it’s also about being able to future-proof the data.

“Optimised cloud storage, combined with exceptional compression capabilities, can provide a highly cost-effective alternative to on-premises data storage,” says Richard Kiel, global head of FX solutions at KX.

An important part of using the cloud is connectivity. Not only do firms have to be able to access the data as quickly as possible, they also need to be able to do so in a convenient way.

“People need to be able to easily incorporate the cloud-enabled platform into their workflows, using languages and capabilities they are already familiar with,” says Wilcockson.

Typical tools that data scientists and engineers already use include Python, Jupyter notebooks and structured query language – and any new cloud-enabled platform needs to be able to seamlessly interact with these.
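
By way of illustration only – using Python’s built-in SQLite rather than KX’s actual interfaces, with invented trade data – the point is that the plain-SQL skills practitioners already have carry over directly to querying tick data:

```python
import sqlite3

# Illustrative stand-in for a tick store: actual kdb+ access would go
# through KX's own interfaces, but the query style is familiar SQL.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE trades (sym TEXT, px REAL, sz INTEGER)")
con.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("EURUSD", 1.0850, 100), ("EURUSD", 1.0852, 300), ("GBPUSD", 1.2700, 200)],
)

# Size-weighted average price per symbol, written as ordinary SQL:
row = con.execute(
    "SELECT sym, SUM(px * sz) / SUM(sz) AS vwap FROM trades "
    "WHERE sym = 'EURUSD' GROUP BY sym"
).fetchone()
print(row)
```

Because the interface is a language users already know, the analytics platform slots into existing notebooks and scripts instead of demanding a new toolchain.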

“This democratises who can get access to the platform straight away and continue using it – not just today or tomorrow but next year and well beyond that,” says Wilcockson.

### Beyond finance

While KX’s cloud-enabled platform has proved an invaluable addition to the trading teams of many large banks and fund managers, its ability to efficiently handle very large amounts of data means its application extends well beyond the financial world.

“The deep insight you need to have for electronic trading is the same insight you need to improve efficiency and automate processes across a variety of industry sectors,” says Darrell‑Brown. “Again, this comes down to having complete, accurate and timely data, from which real-time insights can be easily extracted.”

One of the most intriguing examples of the platform being used outside of finance is in the world of Formula One motor racing.

“If you’re in a race situation and your competitor comes in for a pit stop, you need to be able to judge whether you need to adapt your tactics to compensate for this,” says Wilcockson. “Timely data with predictive algorithmics can provide this insight.”

Another successful use case has been in the supply and delivery of commodities such as natural gas, where timely data is combined with predictive analytics to spot current supply-and-demand trends.

“The point is that our real-time analytics platform doesn’t have to be confined to finance,” says Weinrich. “Whether for trading analytics, predictive health, predictive maintenance or predictive motor sports, the ‘power of three’ still holds: the right analytics for the right group at the right time, tailored to that winning combination of modeller, data engineer and system user.”
