18th February 2019
Regulator's banking survey still throwing up odd results
The latest round of the CMA's service quality survey for banks has been published. The results continue to be out of kilter with other surveys in the market.
Last summer, the Competition & Markets Authority began forcing banks to publish details of a new independent customer service poll on their websites and in their branches.
Back then, you may remember that we were a bit mystified by the first results. While First Direct were top and RBS were bottom - consistent with the polling that I've done in the sector over the past decade - there were plenty of anomalies. Metro Bank seemed to be overperforming compared to our data, while Co-op Bank appeared to be massively underperforming.
The data has to be updated every six months, and this weekend round two of the survey was published.
First Direct, which ran a national advertising campaign on the strength of coming top in the first wave, has surprisingly been knocked off its perch by Metro Bank, which went one better than last time. It's yet another bizarre result from a poll that continues to sit at odds with other customer surveys in the market.
Co-op Bank - which comes 4th out of 29 banks in our customer experience tables - continues to do terribly in the CMA rankings, remaining in 13th position out of 15, after sustaining a slight drop in the percentage of customers who would recommend it.
In our polling, Co-op Bank is ranked 7th out of 29 for happiness, 7th for trust, and is the 7th most likely bank to be recommended by its customers. This is based on polling carried out over the last five years - and we just can't square our data with the CMA's results.
Nowhere else that I've looked can I find the same results as GfK, which carries out the polling for the CMA.
Which? - whose sample sizes are much smaller than ours - puts Metro Bank 4th for customer service, a full 9 percentage points behind First Direct (which loses out only to Monzo).
Co-op Bank comes 9th out of 23 in the Which? survey. And in MoneySavingExpert's six-monthly survey, First Direct is a clear winner out of 11 brands (Metro isn't included) and Co-op comes third.
Although I support the CMA's initiative to help customers compare banks on quality, the inconsistency of the results is alarming - and may well soon lead to outcomes that embarrass the government.
Last year, the fragility of customer polling was exposed in a Times story about travel insurer Holidaysafe. The Times alleged that a number of poor decisions taken by Holidaysafe and its partners had led to some of their customers suffering a deterioration in their medical conditions, or worse. In the days before the story was published, Holidaysafe was ranked by Which? as one of its recommended brands, on the strength of a good customer poll and good product features.
As soon as Which? was made aware of the story, it withdrew the recommendation.
Anyone who provides ratings - and I of course include Fairer Finance in this - takes a risk. Ratings are an art rather than a science, and you can't know what lies around the corner for a brand. TSB was doing swimmingly in most bank ratings before its IT disaster. No rating system could have foreseen what was to come.
But given that we know the limitations of customer polling, the CMA runs a risk by giving so much prominence to this one metric. The reason Holidaysafe and its associated brands did not do well in our rankings is that their underwriter has a poor track record at the Ombudsman, and they don't do very well in our transparency assessment. From a polling perspective, they do fine - and if we had based our ratings on polling alone, they might well have ended up being recommended.
All this is to say that customer polling can be a useful tool - but it shouldn't be used in isolation. Yet that is what the CMA has mandated. Sooner or later, its poll will inevitably be shown up - which could set the whole debate on quality back.
Now that it has got the ball rolling, and begun shifting the focus onto quality, it needs to work on promoting a more nuanced set of ratings.