
Next Gen Performance Measurement: Data Management

Date: October 6, 2016


Ask any middle office team what keeps them awake at night and, nine times out of ten, they will cite one overriding issue: data management.

Without good data management, even the smartest of systems will struggle. Trading volumes may still be at a similar level to five or even ten years ago, but the broader range and complexity of asset classes – plus the need for more granular reporting – mean there are more facets to the data and more variables.

In short, there are more things that can go wrong. And if that’s not enough, teams are under increased pressure to process the data faster than ever.

Absolute Quality Essential

Focusing on the time it takes to process the data is certainly relevant, but only part of the story. Speed is important as long as the right checks and controls are in place to scrutinize the source data and the calculated data at every point in the workflow.


That is to say, the data needs to fire out of the starting blocks at bullet speed and still remain bullet-proof when it reaches its final destination – no easy task when thousands of portfolios are racing down the line.

We’ve quoted Ellen Shubert, Chief Advisor at Deloitte Investment Management, in an earlier report and her comment in the firm’s 2014 Alternative Investment Outlook is worth another look:

“Data is the hottest topic by far. Every meeting I go into right now is about data – the amount of data that funds have to retain, manage, manipulate, and massage for their portfolio managers, their investors, their regulators, and the entire company.”

Recent research by the Economist Intelligence Unit on behalf of State Street quantifies this. The survey found that 93 percent of institutional investors view data and analytics as a high strategic priority; for 35 percent, it is the most important strategic priority.

The fact that data management is the number one issue is not a surprise given its complexity and the rate at which investment vehicles have multiplied over recent years. Controlling data with its inherent variables and complexity is thus a key part of any middle office’s remit.

Controls on the source data mean that errors are less likely to occur in the first place. Advanced data management checks and controls also mean any issues can be easily identified and resolved within the system, as opposed to creating an external manual workaround.

This makes for better audit control and compliance, and boosts managers’ ability to revert to previous sessions and versions should a rollback of changes be required. Overall, staying within the system allows for faster resolution and more certainty about what has happened to the data, when, and why.

Next generation performance measurement also means errors can be identified at source. So, rather than stopping an entire workflow across hundreds of portfolios, the system can quickly rectify the problem at hand without halting the whole process unnecessarily.
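To make that concrete, here is a minimal sketch in Python (the function and field names are hypothetical, not taken from any particular platform) of validation at source combined with per-portfolio error isolation: each portfolio is checked and processed independently, so a bad record quarantines only that portfolio while the rest of the run carries on.

```python
from dataclasses import dataclass, field

@dataclass
class RunResult:
    # Portfolios that processed cleanly, and those held back for review.
    completed: dict = field(default_factory=dict)
    quarantined: dict = field(default_factory=dict)

def validate(positions):
    """Source-data checks: catch issues before any performance is calculated."""
    errors = []
    for p in positions:
        if p.get("market_value") is None:
            errors.append(f"{p['asset_id']}: missing market value")
        if p.get("weight", 0) < 0:
            errors.append(f"{p['asset_id']}: negative weight")
    return errors

def run_daily_performance(portfolios, calc_return):
    """Process each portfolio independently; an error in one never halts the rest."""
    result = RunResult()
    for name, positions in portfolios.items():
        errors = validate(positions)
        if errors:
            # The issue is flagged and resolved inside the system, not in a spreadsheet.
            result.quarantined[name] = errors
            continue
        result.completed[name] = calc_return(positions)
    return result
```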

Industry Response

The research by the Economist Intelligence Unit illustrates how the market now recognizes data as a real competitive asset. Some 88 percent of traditional asset managers said they had increased investment in data infrastructure in the three years leading up to 2013. And 66 percent believe data will be a key source of competitive advantage in the future.

Being able to improve the quality of the data coming into and going out of the performance measurement process adds value overall. It means managers can explore new ways to increase their effectiveness or become more operationally efficient. In a commercial environment where returns are not the only way to add value, this focus can make a real difference.

But for that to happen, a new approach to systems is required. Legacy systems that silo data across several platforms need to go. A system that takes some source accounting data, crunches the performance, and spits out some numbers at the end of the day is no longer fit for the job.

A more sophisticated system in which users can make their own tweaks and correct errors themselves is the way forward. Next-generation platforms can do this because they are cloud-based and rely on a single version of events, using the same data set for all processes. This means data does not fall out of sync, and abnormalities can be identified and corrected easily without holding everything else up.
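As a rough illustration of the “single version of events” idea, the Python sketch below (again with hypothetical names, not any vendor’s actual API) keeps one shared data set in which every correction becomes a new, attributed version – giving both the in-system corrections and the rollback and audit trail described earlier.

```python
from datetime import datetime, timezone

class VersionedDataset:
    """One shared data set for every process, with each correction stored as a new version."""

    def __init__(self, initial_records):
        self._versions = [(datetime.now(timezone.utc), "initial load", initial_records)]

    def correct(self, user, reason, updated_records):
        # Users apply fixes inside the system; every change is attributed and time-stamped.
        self._versions.append((datetime.now(timezone.utc), f"{user}: {reason}", updated_records))

    def latest(self):
        # All downstream processes read from the same current version, so nothing drifts out of sync.
        return self._versions[-1][2]

    def rollback(self, to_version):
        # Revert to an earlier version if a change needs to be undone.
        self._versions = self._versions[: to_version + 1]

    def audit_trail(self):
        # What happened to the data, when, and why.
        return [(when, note) for when, note, _ in self._versions]
```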

The authors of an academic paper – Understanding data management in asset management: a survey – summarize the issue like this:

  • Advances in computational power have paved the way for harnessing complex algorithms for the analysis of operation and condition data.
  • Advances in database technology have allowed huge volumes of data to be collected and processed, as well as spurring on the advent of the data warehouse.
  • The Internet has brought the benefits of information sharing and accessibility to the fore, and corporate system integration and workflow management are now being addressed in current research.

The demand for more accurate and timely performance data is only going to increase as the industry places more importance on the downstream analytics that follow.

Step one is daily performance data that is simple and efficient to produce: fast enough for the front office, and controlled and precise enough for the middle office. Data management issues need to be tamed, and the right technology is essential.

Takeaways:

  • Without excellent data management even the smartest of systems will struggle.
  • Teams need timely, accurate data to work from within a shorter time window.
  • Controlling data with its inherent variables is a key part of any middle office’s remit.
  • High quality daily performance data helps bring the front and middle office closer together and is essential for high quality analytics.

