Solving for Solvency II

Date: June 29, 2015

The story behind Solvency II is no different from the story behind most European Directives: it came as a response to a major failing within the sector, namely the near collapse of American International Group (AIG), the world’s biggest insurer, and the subsequent $182bn bailout by the American taxpayer.

AIG’s downfall exposed the weaknesses of the global insurance world and pushed the Financial Stability Board (a global body that monitors the global financial system and makes recommendations on it) to assess insurance firms and label nine of them as Globally Systemically Important.

It was the AIG Financial Products unit in London and its practice of writing Credit Default Swaps (CDSs) against credit instruments that led to the near collapse, which explains the European focus on increasing supervision of the industry. As a result, Solvency II was conceived and born to introduce new methods of oversight that ensure all risks an insurance firm faces are identified and managed appropriately, including the investment risk that is often delegated to an external asset manager.

Typically, new regulation research at Confluence focuses primarily on regulatory developments (whether new regulation or reform) for the asset management and fund administration industry. Examining Solvency II has been a deviation from the norm, as we look to solve a new form of regulatory problem for our asset manager clients – the need to be compliant for their client (who is the regulated entity).

From 1 January 2016, when the Directive enters into force, an asset manager running money for an insurance firm will be required to respond to quarterly requests from their insurance clients for look-through data on investment holdings. This data feeds into the insurance firm’s calculation of its eligible capital resources and helps it assess its solvency capital requirement (SCR). The better the quality of the look-through, the more accurate the assessment, which can lower the SCR – making it critical that the asset manager gets the reporting right.

To help asset managers accommodate the new data reporting, three industry bodies in France, Germany and the UK have compiled a template for the data required to assess the SCR using the standard formula (a risk-sensitive model designed to calculate all the risks an insurance firm may be exposed to).

This uniform template is a great foundation for building a process around the data requests, but it isn’t that simple – the devil is in the detail. The template specifies generic categories of data to be generated from the client portfolio position file, such as portfolio characteristics, instrument codification, valuations and exposures – all fairly straightforward. But then it moves into more complex categorisation covering transparency, indicative contributions to the SCR at instrument level (calculations for up and down shocks for each risk factor) and, finally, additional and specific information such as identifying any underwritten instruments, ratings of counterparties and issuers, and data on convertible bond floors and holdings. This information feeds through to the insurers’ Quantitative Reporting Templates (QRTs).
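To make the structure of the template concrete, the categories above can be pictured as one record per instrument, with the straightforward fields always present and the complex ones optional. The sketch below is a simplified illustration, not the template’s actual layout: the field names and the `optional_coverage` helper are assumptions made for this example only.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LookThroughRecord:
    """One line of a simplified look-through report for a single instrument.

    Field names are illustrative assumptions, not the template's
    actual column codes.
    """
    # Portfolio characteristics and instrument codification
    portfolio_id: str
    instrument_id: str            # e.g. an ISIN
    # Valuations and exposures
    market_value: float
    exposure: float
    # Complex, optional categories: indicative SCR contributions
    # per risk factor (up/down shocks) and issuer data
    scr_interest_up: Optional[float] = None
    scr_interest_down: Optional[float] = None
    scr_spread: Optional[float] = None
    counterparty_rating: Optional[str] = None

def optional_coverage(records: list[LookThroughRecord]) -> float:
    """Share of records that carry the optional SCR-contribution data."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.scr_spread is not None)
    return filled / len(records)

# Usage: two holdings, only one with the optional SCR detail filled in
records = [
    LookThroughRecord("PF1", "DE0001102309", 1_000_000.0, 1_000_000.0,
                      scr_interest_up=-12_000.0, scr_interest_down=15_000.0,
                      scr_spread=8_500.0, counterparty_rating="AA"),
    LookThroughRecord("PF1", "XS0123456789", 500_000.0, 500_000.0),
]
print(optional_coverage(records))  # 0.5
```

A coverage measure like this is one way an insurer might gauge how complete a manager’s look-through submission is – and, as discussed below, the optional fields are exactly where submissions will differ.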

The interesting point to note here, and what drives our interest in this regulation, is that the complex categories of the proposed uniform template mentioned above are optional, leaving it to the asset manager to decide how much or how little they assist their insurance client with the data reporting challenge.

This may be a result of the template not being enforceable – after all, the asset manager is not actually accountable under the regulation. However, the optional elements are clearly where the ‘value add’ lies for client servicing, so if asset managers want to be a partner of choice within the insurance industry, they need to solve this data analysis and data management problem.