
Apples and Apples: how consistent data is finally within reach for large asset management firms

Date: July 19, 2016

For obvious reasons, many of the largest asset management firms struggle with massive data volumes, siloed systems and fragmented processes.

Reconciling the data can sometimes feel like searching for two identical needles in a giant haystack.

In other words: very time-consuming and expensive.

This is why large firms need a joined-up, single view that gives greater visibility into what happened, why and at what cost. That view relies on consistency in data intake, data analytics and processing.

Traditionally, this has felt like a capability one could only dream about. Asset management analytics and reporting always seemed destined to be a game of apples and pears: until now.

Single source

To achieve consistency, managers need a system that can take in data from multiple sources and normalize it, allowing for a consistent and comparable output on performance, attribution and risk. To this end, it is essential to have the right processes in place to check for date, format, identifiers, tags and other information that highlights the right data.
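As a rough sketch of what such an intake check might look like, the hypothetical Python below validates each incoming record for a parseable date, a recognised identifier scheme and the expected tags before it enters the normalized store. The field names, identifier prefixes and rules are illustrative assumptions, not a description of any particular system.

```python
from datetime import datetime

# Illustrative intake rules; a real system would make these configurable.
REQUIRED_FIELDS = {"trade_date", "identifier", "source", "tags"}
KNOWN_ID_PREFIXES = ("ISIN:", "CUSIP:", "SEDOL:")  # assumed identifier schemes

def validate_record(record):
    """Return a list of problems; an empty list means the record passes intake."""
    problems = []

    # 1. All expected fields must be present before anything else is checked.
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        return [f"missing fields: {sorted(missing)}"]

    # 2. The date must parse against the single agreed format (ISO 8601 here).
    try:
        datetime.strptime(record["trade_date"], "%Y-%m-%d")
    except ValueError:
        problems.append(f"unparseable date: {record['trade_date']!r}")

    # 3. The identifier must use a recognised scheme.
    if not record["identifier"].startswith(KNOWN_ID_PREFIXES):
        problems.append(f"unknown identifier scheme: {record['identifier']!r}")

    # 4. Tags must be present so the record can be routed and aggregated later.
    if not record["tags"]:
        problems.append("record has no tags")

    return problems

# Example: a record arriving from one of many upstream sources.
record = {"trade_date": "2016-07-19", "identifier": "ISIN:US0378331005",
          "source": "custodian_a", "tags": ["equity", "us"]}
print(validate_record(record))  # [] -> clean intake
```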


Using the same data source for the same activity, with the same timeframes and other indicators, means the end result is a like-for-like analysis. Happily, the most recent systems and technological advances mean that data consolidation and consistency are now a realistic possibility, and silos can finally be consigned to the past.

Single normalization and aggregation

Once the data has come into the system, the next step is normalization and aggregation. Robust risk analysis combined with performance attribution relies heavily on data aggregation, and on the data consistency that makes that aggregation possible. In its 2016 Outlook report, Citisoft notes that understanding aggregated risk across all investments and portfolios is increasingly coming to the forefront:

“Changes in how managers want to analyze markets and construct portfolios will begin driving firms to seek more integrated and robust solutions, enabling asset managers to take advantage of the benefits of expanded risk strategies. We believe that investment risk management initiatives will continue to be a major theme for 2016.”
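To make the idea concrete, here is a minimal, hypothetical Python sketch of normalization and aggregation: two upstream feeds that report the same facts under different field names are mapped to one common schema and then rolled up into exposure per asset class. The source names and field mappings are assumptions for illustration.

```python
from collections import defaultdict

# Hypothetical field mappings: each upstream source reports the same facts
# under different names; this table is an illustrative assumption.
FIELD_MAP = {
    "custodian_a": {"value": "mkt_val_usd", "bucket": "asset_class"},
    "custodian_b": {"value": "marketValue", "bucket": "assetClass"},
}

def normalize(record, source):
    """Translate one source's schema into the common internal schema."""
    fields = FIELD_MAP[source]
    return {
        "value_usd": float(record[fields["value"]]),
        "asset_class": record[fields["bucket"]].lower(),
    }

def aggregate(positions):
    """Roll normalized positions up into total exposure per asset class."""
    totals = defaultdict(float)
    for pos in positions:
        totals[pos["asset_class"]] += pos["value_usd"]
    return dict(totals)

positions = [
    normalize({"mkt_val_usd": "1200000", "asset_class": "Equity"}, "custodian_a"),
    normalize({"marketValue": "800000", "assetClass": "equity"}, "custodian_b"),
]
print(aggregate(positions))  # {'equity': 2000000.0} -- one comparable number
```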

Single analytic process

Consistency of the analytics process is next. For the first time, well-known and respected performance, attribution and risk models can be combined on the same platform, so that the breadth of analysis is readily available and users can pick their preferred model.

Because the various models are already well understood and use the same historical data set, potential variation is reduced significantly, or eliminated entirely, resulting in a more accurate view based on known data and known methodology. If managers across both front and middle office are using like-for-like data sets that have been properly aggregated and normalized, then any decisions based on the analysis that follows will be more consistent and better informed.
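As a simple illustration of models sharing one data set, the sketch below computes two different risk measures, volatility and maximum drawdown, over the same normalized return history. The figures and the choice of measures are illustrative assumptions, not a description of any specific platform's models.

```python
import statistics

# One shared, normalized daily-return history (numbers are illustrative).
returns = [0.004, -0.002, 0.001, -0.005, 0.003, 0.002, -0.001]

def volatility(rets):
    """Sample standard deviation of daily returns."""
    return statistics.stdev(rets)

def max_drawdown(rets):
    """Largest peak-to-trough decline of the cumulative return path."""
    level, peak, worst = 1.0, 1.0, 0.0
    for r in rets:
        level *= 1 + r
        peak = max(peak, level)
        worst = min(worst, level / peak - 1)
    return worst

# Both measures read the same series, so any difference in the output
# reflects methodology rather than inconsistent data.
print(f"volatility:   {volatility(returns):.4%}")
print(f"max drawdown: {max_drawdown(returns):.4%}")
```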

Single version of events needed – and quickly

Nevertheless, as always, the devil is in the detail. Without consistent data sources, normalizing, aggregating and processing the data using a known, consistently applied technique is close to impossible.

Although it would be possible to interrogate performance and risk analytics for individual asset classes, the overall picture would still be seriously disjointed and unclear.

All this relies on automation to reduce error and accelerate timeframes. The pressure to automate processes and anticipate intraday risks is considerable, and the greatest opportunity to reduce risk and unnecessary expenditure lies in eliminating manual workarounds. Manual rectifications and resubmissions are both time-consuming and expensive, especially where regulatory risk reporting is concerned.

The fundamental point is that managers cannot afford to make mistakes through inconsistent data intake, processing and output. The whole system needs to be clearly defined and consistently adhered to so that it can be fully automated to achieve quicker results and response times.

The end result is a system that can deliver a detailed daily view of how the money was made, how this compared with what was expected and what risks were involved in each investment decision.

This is by no means a linear process, but the capability to generate an accurate picture leads to much better insight into investment strategy on a day-by-day basis.

Takeaways:

  • Consistency is key when it comes to data intake, data analytics and processing.
  • The end result needs to be meaningful. Managers need to compare like with like.
  • Data intake comes first – a single source of data with reliable parameters.
  • Aggregation and normalization come next – can the system see this data for what it is?
  • Process consistency, applying a known technique, produces the output.
  • Good housekeeping across intake, normalization and processing makes greater automation possible.
  • Automation means faster processing with fewer mistakes and allows for quicker decision making.

