Conditional review of a common theme

Ian Craig


Five years is a long time in anyone’s book and, as a rule of thumb, it calls for either celebration or a period of reflection. In the case of the Pensions Regulator’s Detailed Guidance on Record-Keeping, published in June 2010, I would suggest the latter.

The parameters were clear: trustees were to ensure that, by December 2012, 100% of members had a full set of common data for entries post June 2010, with the standard set at 95% of members for pre June 2010 entries. Behind that, the conditional data – data conditional on a number of factors, such as scheme design, a member’s status in the scheme and their respective individual events – was given a more ambiguous target. The onus was on trustees to be aware of their conditional data, but not necessarily to have taken steps to rectify any issues.

As a means of communicating progress, tPR published its record-keeping survey in 2013 and a follow-up in 2014. Despite an improvement in the scores from 2013, tPR found that many schemes were still falling short of the standards set for measuring data quality and maintaining accurate records. It wasn’t surprising, however, that larger schemes with money to spend had better-quality data than those at the smaller end of the spectrum.

Market developments

It cannot be overstated that good record-keeping is key to running a pension scheme effectively. This target-setting has dovetailed with other developments in the pensions industry over the last five years. In each case, data quality plays a central role:

  • the increase in the de-risking market;
  • the roll-out of auto enrolment;
  • tPR’s publication of the 31 DC quality features;
  • the announcement that contracting out for DB schemes will cease in 2016.

With de-risking, it goes without saying that accurate data is crucial in obtaining the most competitive – or, in some cases, any – price for buyouts, buy-ins or longevity swaps. It is perhaps interesting to note that tPR’s 2014 survey highlighted a greater increase in conditional data scores, which may well tie in with the work involved for schemes seeking to de-risk.

Auto-enrolment itself brought a new slant to proceedings. Employers were tasked with tracking employees who could become eligible jobholders and thus be auto-enrolled. In addition, there were the administrative aspects of recording those who chose to opt out, paying their refund of contributions and managing the process for re-enrolment after a three-year period.

In the DC world, the minimum governance standards have been effective from April 2015. Their importance was reinforced earlier this year by a high-profile announcement that, on some schemes, lifestyle strategies had not been correctly applied. When a problem arises on a DC scheme, it is particularly difficult to untangle and rectify, which emphasises just how important maintaining a high quality of data is.

Lastly, the end of contracting out for DB schemes will lead to a significant amount of data rectification being undertaken. It should also be noted – and, anecdotally, appears to have been missed – that the end of DB contracting out does not simply mean GMP reconciliations across the board. Trustees will have to take action to reconcile their post-97 contracted-out liabilities, too. With HMRC sending individuals details of their contracting-out history from December 2018, all schemes will have to reconcile their GMPs and contracting-out liabilities against HMRC’s records accordingly – or accept HMRC’s word and a potential increase in the scheme’s liabilities.

Without even mentioning this year’s flexibilities, the pensions landscape has changed dramatically in the last five years, but the constant has been the relentless push towards improving standards of data quality. Let’s hope that the next five years see the results of that push come to fruition.
