This year the theme of Mental Health Awareness Week is nature and the enormous benefits it has for our mental health and wellbeing.
In a year cut off from loved ones, with our lives pushed online and limited availability of real-life hugs, the wild green spaces of South West London have been fundamental to my own mental wellbeing. And one of the reasons that nature is so important as a source of comfort is that, unlike our activities online, nature will not track us or profile us in ways that may later be used against us.
Unlike the mindfulness app, the Facebook support group or the online depression questionnaire, the swathes of forget-me-nots on Wimbledon Common this week will not track me or target me for my mood or inferred vulnerability. They will just make me happy, however briefly. This week, we should reflect on how important that is for all our mental health.
Last week, the Markup revealed how pharmaceutical companies were able to target adverts for antipsychotic drugs used to treat bipolar disorder at Facebook users who had an interest in non-profit organisations focusing on mental health conditions. Reset Australia recently revealed how easy it is to buy advertising that targets teenagers whom Facebook identifies as having an interest in “extreme weightloss”. Facebook was also accused, in reporting by The Australian newspaper, of selling advertising based on real-time insights into the anxieties and vulnerabilities of its teenage users. Facebook denied the claim, but real-time bidding on sensitive user data is a very real feature of ad tech in general.
And it is not only social media companies that are mining our mental states. In 2019, Privacy International revealed that the most popular mental health websites in the UK, France and Germany, including the NHS among others, were sharing users’ data with third parties whose business was targeted advertising. In some cases, even the answers to depression self-assessment questionnaires were being shared.
Imagine speaking to your GP about feeling down, only to find they were calling the local pizza restaurant to tip them off that you might need some comfort food. Or confiding in your therapist, only for them to tell a political party how to make you so anxious that you would not leave your home to vote on election day. In the real world, these examples are clearly unethical and illegal; mental health professionals would undoubtedly be struck off for this kind of thing. But somehow, we have failed to police the legal boundaries of our minds and our mental health in the online space.
Following the Privacy International report, the NHS stopped sharing data with third parties, but most of the other websites flagged in the report continued the practice. In 2020, Privacy International raised a complaint against Doctissimo.fr with the French data protection regulator, flagging the ways that this practice flouted data protection and privacy law. But the sale of mental health data involves more than just questions of privacy.
Data protection laws in the UK and the GDPR in the EU limit the way personal data can be gathered and used. But those laws must be interpreted in accordance with wider human rights law. The right to freedom of thought and the right to mental integrity in national and international human rights law demand the strongest protections against activities that interfere with our mental health and the way our minds work. Making those rights effective in practice, however, requires serious regulation and enforcement.
The pandemic has sparked a mental health crisis like nothing we have seen since World War II, and mental health support has been disrupted, pushing people online to get help. But before we embrace technological fixes for mental health problems, we need to make the digital environment a safe space for mental health. Yesterday, in the Queen’s Speech, Boris Johnson vowed to ensure the internet was safe for all, especially for children. Addressing the mining of mental health data must be a cornerstone of that policy.