15 August 2018

What's the right way to measure quality in banking?

By James Daley

The CMA has forced banks to start publishing polling data, showing how likely their customers are to recommend them. It's an important step by the regulator, but do the results provide the right answers?

Although most of the industry are probably sunning themselves on a beach somewhere, today is quite a big day for the banking sector.

Two years on from the publication of the Competition & Markets Authority’s final report into the banking industry, British banks have been forced to start publishing customer polling scores on their websites – showing how they stack up against their competitors.

It’s a landmark event – and not just in banking – as it’s the most deliberate move yet by a UK regulator to shift customers’ attention to quality, rather than price.

That has to be welcomed. For too long, the current account market has been all about teaser interest rates, cashback and one-time switching handouts. But given that you’re more likely to change your partner than your bank, there are few financial decisions that are more important than choosing your current account provider.

It’s your primary financial relationship – the gateway to all your personal finances. It’s ludicrous that we might base our choice on whether we can get £100 or £150 for making the leap.

Winners and losers

The CMA data broadly shows what we would expect it to – First Direct on top, RBS on the bottom. That’s consistent with the polling that I’ve been doing in this sector for the past decade.

But there are some anomalies in there too. Co-op Bank seems to get a particularly hard time of things, coming in 13th out of 15 in the first set of ratings. In our polling, where we have a sample of almost 1,500 Co-op Bank customers, they are 8th out of 27 brands for net promoter score (ie customers who would recommend them), and 7th for customer satisfaction.
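
For readers less familiar with the metric, here’s a minimal sketch in Python of how a net promoter score is conventionally calculated – the sample figures are invented purely for illustration, and this isn’t a description of the CMA’s survey methodology or of our own.

# Minimal sketch of a conventional net promoter score calculation.
# Respondents answer "how likely are you to recommend us?" on a 0-10 scale:
# 9-10 count as promoters, 0-6 as detractors, and the score is the
# percentage of promoters minus the percentage of detractors.
def net_promoter_score(ratings):
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

sample = [10, 9, 8, 7, 10, 6, 9, 3, 8, 10]  # invented responses
print(net_promoter_score(sample))  # 30.0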

Metrobank also appears to overperform somewhat, coming in a close second to First Direct – whereas in our data they are a distant second, and have slipped to 6th out of 27 in our customer satisfaction poll.

The art of measuring quality

So now that the CMA data is published, it feels as though it’s time to broaden the debate about how you really measure quality in banking.

It’s a subject that – as you might imagine – is close to our hearts.

Like the CMA, Fairer Finance also measures banks on customer polling. But that only accounts for about 50% of our ratings. The reason we don’t rely on polling alone is that, in my experience, customers are not always the best judge of whether they are getting a good deal.

A customer may be delighted with their bank, whilst not realising that they are paying over the odds in fees and charges, or that their bank is poor at managing complaints when things go wrong.

So we ask customers how happy they are with their bank, and how much they trust them, but these two elements only account for 50% of the overall score in our tables.

The other half is split between an assessment of their performance at handling complaints – using data from the Financial Ombudsman Service – and an assessment of transparency. We assess transparency by mystery shopping their purchase journeys to see whether they are upfront with customers about the elements of their products that we know are least understood. We also assess the clarity of their literature and communications.
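
To make the arithmetic concrete, here’s a minimal sketch of how a weighted score along these lines could be combined. The text above only specifies that polling makes up roughly half of the rating, so the equal 25% sub-weights and the input figures below are assumptions for illustration, not our published methodology.

# Illustrative weighted quality score. Assumed equal sub-weights within
# each half (the text above only fixes polling at roughly 50% overall).
WEIGHTS = {
    "happiness": 0.25,     # customer polling
    "trust": 0.25,         # customer polling
    "complaints": 0.25,    # e.g. performance in Financial Ombudsman data
    "transparency": 0.25,  # mystery shopping and clarity of documents
}

def overall_score(components):
    # Each component score is assumed to be on a 0-100 scale.
    return sum(WEIGHTS[name] * score for name, score in components.items())

# A hypothetical bank that polls well but handles complaints poorly
example = {"happiness": 82, "trust": 78, "complaints": 55, "transparency": 60}
print(overall_score(example))  # 68.75

The point of the sketch is simply that a bank can poll well with its customers and still be pulled down by weak complaints handling or opaque documents.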

In my view, that gives a more rounded picture of a business.

A difference of opinion

When we launched our tables four and a half years ago, Metrobank was top. Since then, Metrobank has slipped to 11th out of 27 providers overall.

I focus on Metro because they seem to do incredibly well in the CMA data. And while they do have many strengths, I’m not sure they deserve the glowing endorsement that the CMA data gives them.

Similarly, the likes of Santander and Barclays do better than they deserve in my view. They are joint 5th in the CMA tables, compared to 19th and 24th out of 27 respectively in our tables. Both brands are mid table for customer polling in our data – based on samples of several thousand – but they are below average for complaints handling and transparency.

Part of the problem with the CMA data is that it only includes brands that have more than 150,000 customers. We go much further – and have almost twice as many brands in our tables – and even then we don’t get a large enough sample to include the newest challengers, such as Starling and Monzo.

My worry about the CMA data is that for banks like Barclays and Santander, there will be pats on the back all round – when actually they still have lots to do.

FCA service metrics

At the same time as publishing the CMA data, banks have also been forced by the FCA to publish a range of other service metrics. Sadly, these provide even less utility for customers. The most useful piece of information here is whether banks have had any major operational or security incidents over the past six months. But no comparative data is provided, so it’s hard to assess whether the number is good or bad.

TSB’s data shows it had seven major incidents between April and June, the same number as First Direct. What that data doesn’t tell you is that TSB’s services came to a grinding halt, causing chaos for its customers. As for First Direct – who knows what those incidents were, or how severe they were.

These are important issues that need to be ironed out. The data needs to be better, and we need to reconsider what the right questions are and how best to ask them.

But none of this should detract from the significance of today’s publication. It’s an important step in a new direction for regulators. I hope this will be the beginning of much more work to help customers compare on quality, and not just price.