Social media companies must address edited pictures fuelling negative body image
In the United Kingdom, more than 1.25 million people suffer from anorexia or bulimia, over one million have body dysmorphic disorder, and between 500,000 and one million use anabolic steroids.
As many as one in three teenagers feel shame about their body, and a similar proportion of girls and young women won’t share pictures of themselves unless they have used filters or apps to change their appearance.
A poor relationship with body image is not a new phenomenon; what is new is the frequency, volume and type of images that social media sites push at and target to users.
If you like a few photos of someone in the gym, your timeline fills with edited images of ripped men, some of whom have taken image- and performance-enhancing drugs to build that muscularity.
But how, if you’re quickly scanning your timeline, do you know whether the person in the image has used steroids to achieve that look? Or edited their body proportions through Photoshop or apps like Facetune? The problem is, you don’t.
I welcome the Advertising Standards Authority’s (ASA) deep dive into the digital manipulation of body images, the depiction of muscularity and its knock-on effects. But the question remains: what more can, and should, we be doing?
Many other countries are grappling with this and have introduced legislation requiring labels on images where body shapes have been manipulated, or are cracking down on steroid use by introducing licensing in gyms. We too must explore the solutions available to us.
I’ve been calling for companies to change their behaviour and am pleased that my Body Image Pledge, a voluntary commitment not to digitally alter body proportions, has been supported by Marks & Spencer, Boots, Dove, Boohoo Group and many more.
Much like the product placement “P” logo or the #ad tag in the captions of paid-for posts, I believe adverts and social media posts should be labelled when a person’s body shape has been digitally manipulated. It was encouraging to see this form part of the Women’s Health Strategy.
But we should be bolder and more strategic when it comes to data, the online world and Artificial Intelligence (AI). Over 1,000 companies are supporting Adobe’s Content Authenticity Initiative, which works like the song-identifying app Shazam but for photographs, informing users of an image’s provenance from creation through manipulation to publication. Possible applications are far-reaching, including the assessment of deepfakes from the invasion of Ukraine.
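To give a flavour of how such labelling could work in practice, the sketch below is a minimal, hypothetical illustration (not the Content Authenticity Initiative’s actual schema or API) of how a platform might read a provenance record attached to an image and decide whether a post warrants a “body shape digitally altered” label; the record fields and edit-action names are illustrative assumptions.

```python
# Hypothetical sketch: labelling a post from an attached provenance trail.
# The record format and edit-action names are illustrative assumptions,
# not the real Content Authenticity Initiative / C2PA specification.
from dataclasses import dataclass, field


@dataclass
class ProvenanceRecord:
    """A simplified provenance trail: who created the image and
    which edits were applied between creation and publication."""
    creator: str
    edit_actions: list[str] = field(default_factory=list)


# Edit actions that, in this sketch, would trigger a label.
BODY_ALTERING_ACTIONS = {"liquify", "body_reshape", "face_retouch"}


def manipulation_label(record: ProvenanceRecord | None) -> str:
    """Return the label a platform might display alongside the post."""
    if record is None:
        # No provenance attached: the platform cannot verify the image.
        return "Provenance unknown"
    if BODY_ALTERING_ACTIONS.intersection(record.edit_actions):
        return "Body shape digitally altered"
    return "No body-shape edits recorded"


if __name__ == "__main__":
    post_image = ProvenanceRecord(
        creator="fitness_account",
        edit_actions=["crop", "body_reshape"],
    )
    print(manipulation_label(post_image))  # Body shape digitally altered
    print(manipulation_label(None))        # Provenance unknown
```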
It’s one thing dealing with doctored images, but the larger obstacle is the algorithm. Evidence from Facebook whistle-blowers showed the horrific effects of algorithms pushing diet adverts, self-harm videos and anorexia content to teenagers. Tragically, this was brought home in the case of Molly Russell, when the coroner concluded that social media played a “more than minimal” part in her death.
In my experience of trying to discuss algorithms with social media companies, I’m either stonewalled or told they’re commercially sensitive or too complicated. If we are to have effective oversight, this must be addressed. The Online Safety Bill goes some way towards addressing this, but with the rapid advancement of AI capabilities we must find a balance which allows industries to grow while providing critical protections for the public. The AI Governance White Paper and the Data Protection and Digital Information (No. 2) Bill show some prospect of addressing these concerns, though this must be done swiftly.
Despite experiencing a new generation of technological development we, as humans, can’t escape our evolution. Conforming to social cues has been key to our survival, but it also leaves us susceptible to following unhealthy trends.
Unless we tackle what is under the bonnet of social media, the allure of what could be – however unhealthy and unattainable – will continue to harm many in our society who are being repeatedly targeted by algorithms.
Dr Luke Evans, Conservative MP for Bosworth