Firms running their VoC programme through Trustpilot are doing one job — reputation management — which is important, but it doesn’t drive retention.
The gap between the two is where churn lives. This piece sets out the programme teams need to run alongside Trustpilot to see an ROI from VoC.
Picture the CX team at a mid-market UK firm. They own Trustpilot. They reply to reviews within the hour, in the right tone, with the right escalation paths. The board is happy that “we’re on top of reviews.” But the retention numbers haven’t moved. And nobody on the team can quite explain why responding to a one-star review is supposed to help.
By the time the review is written, the retention conversation is already over.
A one-star review is a leaving notice, not a feedback signal. The retention conversation needed to happen weeks or months earlier, and Trustpilot can’t see that conversation.
Review-site programmes capture a single moment in the customer journey: the moment a customer is angry and motivated enough to write something on the internet. By that point, the renewal decision has almost always been made. Replying within an hour with an apology and a link to a complaints process is reputation management. It is not retention. Reputation is important, but using Trustpilot as a VoC programme is closing the stable door after the horse has bolted.
There’s a related point worth making here: replying to a public review is not the same as closing the loop with the customer. The loop closes when the customer’s friction is found, fixed and acknowledged. But online review replies jump to the end of that process without doing the work of service recovery and improvement in the middle.
“Only angry customers give feedback” isn’t a law of customer behaviour. It’s a symptom of a feedback programme that’s badly designed, and it’s self-reinforcing.
The standard line inside Trustpilot-led firms goes like this: “Surveys are dead. People don’t fill them in. The only customers who give feedback are the angry ones — that’s why Trustpilot works for us.” It sounds fair, but it’s more often than not a diagnosis of a broken programme.
The surveys most mid-market firms send are generic, badly timed, difficult to complete and impersonal. Of course response rates are low.
I was with an insurer last week who told me their survey response rates are “really, really, really low”. When you look at the surveys themselves, the reasons are not mysterious. This isn’t a one-off. We see the same pattern in every vertical we work with: insurers, banks, social housing, utilities and B2B.
When a survey programme is bad, you only ever hear from the angry. Which “proves” only the angry give feedback. Which justifies not investing in the survey programme. Which keeps the programme bad. Then a vendor promises you that “AI will fix the problem”, and the programme stays bad.
The fix isn’t more surveys. The fix isn’t AI.
It’s a better programme.
Timely, transactional, low-friction surveys, sent at the moment a renewal, claim or onboarding step actually happens. When a survey is shaped like that, response rates rise meaningfully. More importantly, the silent majority shows up. Including the satisfied. Including the about-to-leave-but-not-yet-angry. The exact group whose feedback would actually move retention. (Timing is doing more of the work here than most CX teams realise.)
Walk into a mid-market exec meeting and the question on the slide is rarely “how is our public reputation?” It’s “why is retention sliding?” or “what’s happening at renewal?” or “why is the cohort from Q3 churning faster than the cohort from Q1?”. Trustpilot cannot answer those questions. It was never designed to.
Trustpilot tells you what a self-selecting subset of leavers thought after they’d already decided to leave.
It doesn’t tell you which journeys are leaking customers, at which moment, and why.
This is also why CX teams often feel stuck. The team is doing the job they were asked to do, and doing it well, yet they’re trapped watching a metric they don’t own move in the wrong direction.
That’s not a people problem. It’s a tooling problem — and it shows up across financial services more broadly, anywhere a review-site programme has been stealthily promoted into a retention strategy.
This piece isn’t anti-Trustpilot. Trustpilot is good at the job it’s good at. The damage is done by putting it in a job it can’t do.
Trustpilot does PR and SEO well. This matters commercially, and is hugely valuable to growth. The CX team replying within the hour are doing something important. As I said to the insurer above, “I won’t bash Trustpilot — it’s your shop window. But it’s not the same as VoC.”
The fix isn’t to rip Trustpilot out. It’s to stop pretending it’s something it isn’t, and to stop using its presence as a reason not to build the programme that would actually answer the board’s question.
The retention work happens upstream of the one-star review. A real VoC programme is shaped to catch those upstream moments, not to clean up after them.
The point isn’t more feedback. It’s the right feedback, from the right customer, at the moment there’s still time to act on it.
This is the work CustomerSure does with clients: in-journey transactional VoC programmes that surface friction at the moment a renewal, claim or onboarding step goes sideways, while the relationship is still recoverable. It’s a different programme to a Trustpilot inbox, sitting alongside it rather than replacing it. (Covéa Insurance is a useful example of what it looks like in practice: a real VoC programme running in the journey, across the whole journey, in an insurance business that takes retention seriously.)
When a programme like that is in place, the shift it produces is straightforward. The silent majority starts showing up in the data. The dissatisfaction that used to surface as a one-star review three weeks later instead surfaces as a flagged response on the day, while the relationship is still recoverable.
Look at your last board pack. If the question on the slide about churn, retention or renewal was answered by a Trustpilot star rating and a response-rate stat, you are capable of so much more.
Trustpilot is fine. It’s just not the answer to every question.
If you want to talk about what an in-journey VoC programme looks like — what the surveys are shaped like, where they sit in the journey, and what changes once they're running alongside your Trustpilot work — we're happy to walk through it.