Ask any asset manager about their dream system for the middle office and the response would typically include three words: fast, accurate and efficient.
With this functionality in place, the daily workflow can be streamlined, allowing teams to focus on more interesting and valuable activities around performance analytics. In an ideal world, the right checks and balances also reduce the time spent on data corrections, recalculations, manual workarounds and any other time-consuming task that can lead to bottlenecks and delays.
This is an area that Ian Thompson, Head of Product Development at StatPro, has focused on in recent years while building a new performance measurement platform from scratch. The result is StatPro Revolution Performance and, ahead of its official launch in London on 28 September, we caught up with him to explore the issues.
Here is an edited version of his comments in conversation:
100% accurate… but faster
“The key themes around performance that we have been discussing over the last few months are how teams need to work faster and more accurately in order to manage today’s investment environment,” says Thompson.
“Trading volumes haven’t necessarily increased, but the complexity of the data means it is more granular, with more facets to take into account. We are also very aware that teams are still under pressure to turn everything around quickly.”
But granularity, complexity and a huge number of variables do not change the fact that speed is still of the essence.
“Every client has a different number of portfolios calculating different types of returns over different time periods and different levels of transactions. The latest technology means we can scale up to hundreds and hundreds of servers that can theoretically process the data inside 60 minutes rather than the many hours it currently takes.”
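The article does not describe the calculations themselves, but the "different types of returns over different time periods" Thompson mentions rest on standard performance-measurement arithmetic. As a minimal illustration of the kind of work being scaled out, here is the geometric linking of periodic returns into a cumulative time-weighted return (the function name is illustrative, not StatPro's API):

```python
from functools import reduce

def linked_return(periodic_returns):
    """Geometrically link a series of periodic returns (e.g. daily)
    into one cumulative time-weighted return."""
    return reduce(lambda acc, r: acc * (1.0 + r), periodic_returns, 1.0) - 1.0

# Three daily returns of +1%, -0.5% and +2% link to roughly +2.505%
cumulative = linked_return([0.01, -0.005, 0.02])
```

Running this calculation independently per portfolio is what makes the problem parallelise so naturally across many servers.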
Flexibility in the Cloud
“Even for really large asset managers and asset servicing firms, it is possible to process all these portfolios very quickly using cloud-based applications that automatically upscale so that the manager does not lose speed,” Ian Thompson comments.
“This ties in with what managers want: reduced turnaround times, so they can process the data and send the results out to their clients more quickly.”
“Doing this within one or two days rather than one or two weeks after month end clearly adds real value. And if anything needs to be reprocessed then that too can be turned around much more quickly.”
“The scalability and the speed which comes from the Cloud is inherently attractive,” he says. “People love the idea that the Cloud can be auto-scaled to deal with whatever processing needs they have.”
Control is as important – checks and balances are key
Thompson says that scalability isn’t just about improving the speed of calculation; it also means ensuring the accuracy and availability of results and their subsequent consumption.
“Systemic control is essential, which is why we have developed over 80 data control checks on the source data, all configurable by clients, to take care of this particular issue.
“We have also included advanced data management controls to easily identify any problems and then resolve issues in a timely fashion. It’s about being able to get to the source of a problem within the controls.”
Thompson cites “Eight top expectations from a Performance Measurement and Attribution System”, by Saurabh Kumar.
Here Kumar states that any system should be: “Good enough to flag some preliminary data quality issues like missing data or breaks in reconciliation. The system should also have a robust audit management functionality. Even the smallest of changes made on the system should be captured in the audit log and these logs should be accessible with the touch of a button on the GUI.
“Error message is one of the most underrated features of a calculation engine. In a Performance Analytics platform, the error/warning message should flag bad records or missing data points precisely. If the end users can’t decipher any critical error message quickly from the process status, then that message is redundant for any type of problem assessment in the system.”
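Kumar's two points above can be sketched in a few lines: a preliminary data quality check should name the exact record and field that failed, not just report a generic batch error. This is a hypothetical sketch, not StatPro's implementation; all names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class DataIssue:
    """A precise, record-level data quality finding."""
    record_id: str
    field_name: str
    message: str

# Fields every source record must carry (illustrative choice)
REQUIRED_FIELDS = ("market_value", "valuation_date")

def run_quality_checks(records):
    """Flag missing data points with a message that pinpoints the
    record and field, so the end user can decipher it quickly."""
    issues = []
    for rec in records:
        for field_name in REQUIRED_FIELDS:
            if rec.get(field_name) is None:
                issues.append(DataIssue(
                    record_id=rec["id"],
                    field_name=field_name,
                    message=f"record {rec['id']}: missing '{field_name}'",
                ))
    return issues
```

The point of the structured `DataIssue` objects is exactly Kumar's: an error message that names the bad record is actionable, whereas "calculation failed" is redundant for problem assessment.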
Data controls mean x going wrong does not stop y going through
Being able to process part of a workflow without the whole job stopping is something else that people need. There are always dependencies between the steps of the workflow, and portfolio activity needs to pass all of the controls without any errors before going into the actual performance calculations.
But, says Thompson, if there is an error for that one portfolio, it shouldn’t stop any of the other portfolios within the group from going through, right through to the end of the process.
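The isolation Thompson describes amounts to catching failures per portfolio rather than per batch. A minimal sketch of that pattern, with purely illustrative names (the article does not describe StatPro's actual workflow code):

```python
def process_batch(portfolios, calculate):
    """Run a calculation for each portfolio independently; a failure
    in one portfolio is recorded and does not stop the rest of the
    batch from running right through to the end."""
    results, errors = {}, {}
    for pid, data in portfolios.items():
        try:
            results[pid] = calculate(data)
        except Exception as exc:
            # Capture the error against this portfolio for later
            # resolution instead of aborting the whole group.
            errors[pid] = str(exc)
    return results, errors
```

Everything that passes lands in `results`; anything that fails lands in `errors` with enough context to investigate, which is what keeps one bad portfolio from holding up the group.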
“All this comes down to having a fast performance system that is functionally complete, with the data controls, data management and workflows built in, so there is no need to produce manual workarounds outside of the system.”
“People love being able to easily navigate around a system – to quickly get to the source of problems and resolve them. They love the fact that they can have all their controls in one place rather than having to build processes around systems to identify any data issues they have.”
“It’s these things that make it possible for the performance team to have a more productive day,” says Thompson.
“It’s all about making the complex simple so performance analysts can concentrate on what they are really meant to be doing: working with the clean data to deliver exceptional analytics.”
- A dream system is one that is fast, accurate and efficient.
- The scalability and speed that come from the Cloud are where more and more asset managers are starting to focus their attention.
- Accuracy relies on data controls, data management and workflows, eliminating the need for manual workarounds.
- Saving time for the performance analyst means they can concentrate on the more value-added areas of their job.