Today I’ve got to sidestep away from our usual waffling about fake tan and leggings and write about something that affected me quite a lot last night. I was having my usual browse through Instagram and came across a hashtag that caught my eye – #anorexia.
Out of curiosity I clicked on the link, thinking that surely there couldn’t be that many posts associated with this mental illness. Unfortunately, I was really, really wrong. What I found as I dug deeper into the topic genuinely disturbed me, and eventually compelled me to write this post so that we’re all much more aware of exactly what is being shared on the photo-sharing site that many of us use every day – and just how easily accessible this damaging content is to people of all ages.
There are over one and a half million posts dedicated to anorexia on Instagram. The vast majority of these photos represent a massive online community of people who are either suffering from, or deeply interested in, anorexia. These users regularly share genuinely disturbing content: ‘inspirational’ photos of emaciated girls, encouraging quotes about not eating and losing weight, and frankly horrific images such as these below:
Users actively interact with these photos, encouraging each other to give up certain foods or to go without eating at all. Whether the people behind these accounts genuinely follow through on these pledges is unknown, but the photos reveal the extent to which this community of impressionable young girls is thriving on the site.
Arguably, you’re always going to have this sort of content somewhere on the internet. But what I found most interesting is the part that Instagram itself actually plays in this pro-ana community.
When you click on the link, the above warning pops up. To me, this is actually really disturbing – it shows that Instagram are monitoring the hashtag, have recognised its connection to those suffering from a mental illness, and yet continue to give users the option to view it anyway. Is this really their idea of safeguarding? You could argue that it’s not Instagram’s responsibility to control what is shared on the site – but that argument falls apart when you consider what happens if you try to search for sexually explicit content:
The more I looked into the pro-anorexia content, the more I came across a disturbing crossover into the world of self-harm and teenage depression. ‘#Cutting’ brings up the same warning as the anorexia tag, this time seeking to advise users on suicide and self-harm. Once again, though, they’re welcome to click through the warning and access over a million images focused on self-loathing and physical harm.
I’ve chosen not to screenshot any of the actual images themselves, but some of them are incredibly graphic shots of deep cuts and arms covered in blood. All bear captions claiming that the person behind the account has just made those cuts and immediately uploaded the photos to Instagram. Here’s a snippet of some of the hashtags used alongside these images – which, again, Instagram are clearly making no effort to monitor or censor:
I won’t pretend that I understand what the people behind these accounts are thinking or feeling. Whether they’re all genuine sufferers, are crying out for attention or are just caught up in the subculture isn’t something you can assume from looking at any of the accounts. All you can really tell is how potentially damaging this all is. Most of the users are incredibly young, some describing themselves on their profiles as being as young as 12. And if all you have to do to access images of sliced-up wrists and starvation is pick up a smartphone and click through one ‘warning’ from Instagram, then you can imagine just how many impressionable young people these warped images and ideas could be reaching. I have four nieces, and to think that any of them could view content like this and be drawn into these communities is genuinely distressing.
The argument will always be made that responsibility for this lies with parents, who should be monitoring which sites their kids are accessing. In theory this is obviously true; in reality, kids and teenagers will always find a way around it. There’s also the argument that this conversation, if removed from Instagram, would simply continue on other platforms – whether that’s Tumblr (where I know this is also a big problem), forums or chatrooms. But surely Instagram – as one of the most popular and most accessed apps on smartphones – has a responsibility to remove this sort of content from its site? No, it won’t solve the wider problem at hand – but that’s no reason to permit these communities to exist.
What do you all think of this topic? Should Instagram be doing more to prevent this kind of content?