Big White Wall: One of four NHS apps found to be clinically effective

3rd November 2015


Most apps designed for mental health sufferers — including those endorsed by the NHS — are clinically unproven and potentially ineffective, a new study has shown.

In research published in the journal Evidence Based Mental Health, a team at the University of Liverpool found that many mental health apps and online programmes lack "an underlying evidence base" and suffer from "a lack of scientific credibility and limited clinical effectiveness". Mental health apps have grown in popularity at a time when psychological services face increased demand and decreased resources. Referrals to community mental health teams and crisis services have increased by 15%, while around 200 full-time mental health doctors and 3,600 nurses have been lost.

Jen Hyatt is CEO of Big White Wall, an online community for those experiencing mental health problems. She’s passionate about what she calls the “transformative” role of mental health apps.

“They can provide access to services from the comfort of the home. Many people find it hard to access services because of geography, because of mental ill health, because of physical disability. We’ve also found that, in the 50% of cases that do get to a GP, they’re not able to guide mental health problems adequately.”

The four NHS apps found to be clinically effective were Big White Wall; Moodscope, a self-tracking and peer support network; Happyhealthy, a mindfulness app; and WorkGuru, an occupational stress-management programme.

Big White Wall and WorkGuru, among others, are keen to make sure that mental health apps are clinically sound and socially responsible — but many other apps fail to match this attention to detail. There are thousands of unverified mental health apps available for Apple and Android, encompassing mindfulness, CBT, mood tracking, peer support and more. So how can we make sure we're not being duped?

Jen Hyatt insists on a similarly rigorous process at Big White Wall.

“I have no tolerance for developers who try to avoid taking responsibility for the safety of people online. We have a responsibility to our users – it’s the only route to a good, rigorous resource.”

“We have support staff 24/7. We have data analytics, screening tests, and a clinical governance handbook that has protocols for issues like suicidal ideation, self-harm and other crises. These can be escalated to a clinical psychiatrist within two minutes.”

“And Apple and Google should make prominent those apps that have this kind of strong basis. The whole industry has a responsibility to promote those that work.”