This article first appeared on

Emalyn Thielking started self-harming when she was 12 years old. Soon after, she found a place where she could share her pain and connect with others who were hurting in all the same ways: social media.

It was a corner of the online world where teens in the U.S., Canada and from around the globe post images of self-inflicted cuts, fantasies about suicide and memes that appear to glorify both.

‘I felt very alone,’ Emalyn, now 18, said. ‘Then I discovered on Instagram there were so many other people that feel so alone or feel so outcast, that there is a community of people … who feel the same way I do.’

‘It almost became competitive with the people you follow and who follow you back, like who can cut the most, who can cut the deepest, who can go the farthest … who’s the sickest,’ she said.

The images range from graphic and bloody to darkly humorous – from a teenager holding a blade to his wrist to the classic meme of Moe from ‘The Simpsons’ deciding against suicide with the caption, ‘Not today, old friend.’

Social media giants Instagram and Tumblr have been trying to strike a balance between protecting users and preserving free speech as they respond to the trend, in which teens find and share the grim images using hashtags such as #suicide, #depression and #selfharm.

The issue came to a head after the death of 14-year-old Molly Russell in Britain, who is believed to have died by suicide after viewing such images on social media.

Her father, Ian Russell, has blamed Instagram for playing a role in his daughter’s death, saying she had previously been an outwardly happy, healthy teen.

It’s the kind of story that is terrifying to parents – especially given a recent study in the journal Pediatrics that found half of parents whose children have considered suicide had no idea the thought had crossed their children’s minds.

Suicide is the second-leading cause of death among Americans ages 10 to 24, accounting for 17.3 percent of deaths in that age group – and the numbers have been rising each year since 2013, according to the Centers for Disease Control and Prevention.

In the aftermath of Molly’s death, Instagram, which is owned by Facebook, has partnered with mental health experts to create new policy around the issue. Earlier this month the company announced it would no longer allow content showing graphic self-harm or promoting suicide.

‘Nothing is more important to us than the safety of the people in our community,’ head of Instagram Adam Mosseri said in a blog post when the change was announced.

‘We have allowed content that shows contemplation or admission of self-harm because experts have told us it can help people get the support they need,’ Mosseri added. ‘But we need to do more to consider the effect of these images on other people who might see them. This is a difficult but important balance to get right.’

However, a spokeswoman said the company has chosen not to block the hashtags in an effort to continue serving users’ need for self-expression and to allow space for people who rely on the community for emotional support.

Now, when someone searches #suicide, a message pops up asking, ‘Can we help? Posts with words or tags you’re searching for often encourage behavior that can cause harm and even lead to death. If you’re going through something difficult, we’d like to help.’

The message is followed by options to get support from a helpline or a friend. Users can also choose to ‘see posts anyway.’

Tumblr offers a similar prompt, but officials did not respond to a request for comment.

Instagram is at the beginning of the process – images of people with fresh cuts and memes about wanting to die are still easy to find.

In the meantime, the company is no longer recommending that content – including images of healed wounds shared by many former cutters to promote body positivity and dispel the stigma that comes with those scars.

Despite the controversy, some experts caution against condemning the social media companies – or the teens posting the content.

‘Suicide is not going away; suicide on social media is not going away,’ said Dr. April Foreman, a psychologist and expert on social media and suicide prevention. ‘We’ve survived moral panics about rock ’n’ roll … and Elvis Presley, and now we have the moral panic about social media and suicide.’

Foreman is quick to point out that social media is merely a reflection of society as a whole, noting that 8 million to 9 million Americans are at high risk for suicide at any given time, roughly 1.3 million will attempt it and an estimated 45,000 will die by suicide this year.

‘This is happening,’ she said. ‘It’s the 10th leading cause of death. We would not expect suicide to not come up on social media … but we hold these social media platforms way more accountable than we would hold most therapists.’

That’s not to say that Foreman thinks such posts aren’t problematic or a reason for parents to be concerned.

‘If I had a child that was looking at memes about suicide all over the internet, we would go see somebody and I would be very worried,’ she said.

However, she suggested a bigger concern is the lack of funding for research into the causes and prevention of suicide.

‘We don’t really know a lot about suicide in general so we don’t know much about how much causality these images have in people getting worse,’ she said. ‘We do know that it’s probably a bad idea if you have a young adult that’s considering suicide, if a mistake in the algorithm is sending them these (suicide-related) images, we know that’s probably not good.’

Stephen Brock, a psychology professor and program coordinator at California State University, Sacramento, said it’s also important to consider that the single biggest factor in suicide is isolation.

‘To the extent that social media can give (teens) a connection to pro-social, positive, life-affirming messages, it can be an aid,’ Brock said. ‘But if that connection is to antisocial, death-oriented messages it can definitely promote suicidal ideation and promote suicidal behavior.’

Emalyn said that she has no desire to get back on Instagram and start searching for self-harm images – but she does recognize positive aspects to the community around suicide and depression hashtags.

It’s not uncommon to see messages of support on posts about suicide and self-harm. One user recently responded to a video about suicide by saying, ‘Look don’t hurt yourself. I know you are better than that.’

Cooper Johnson, 18, of Australia, is one of the people who actively and anonymously searches out depressing images and posts messages of support for those in need.

‘I think (it’s important) just showing them that they aren’t alone and there are other people out there that know exactly how they are feeling, someone that they can talk to,’ he said.

Emalyn said those kinds of messages from strangers helped keep her going when things got tough.

‘It’s just (a feeling) like other people care about you when they like your posts or they comment on them (saying) that you’re going to be ok,’ she said.

In addition to the support she received, Emalyn said she gradually was able to distinguish the posts that were glorifying suicide and self-harm from those that were focused on recovery and encouragement.

‘It’s really easy now to see the divisions,’ she explained. Still, Emalyn said if she were a parent she wouldn’t want her kids to look at the kind of material she had been viewing.

‘I’d be terrified,’ she said. ‘I wouldn’t want them to be in that same (place).’

Her father, Dave Thielking, said he had no idea that his daughter was viewing any such images, but if he had he would have responded with ‘shock and concern.’

‘We had a rule that she could only use the internet one hour a day in our presence, but obviously as she got older that got harder and harder to control,’ he said. ‘In this day and age it’s hard to know what they’re accessing … I don’t know what I would have done.’

Foreman said that parents can use many different tools to limit their teens’ access to the internet and social media – both by filtering out specific content and by defining the hours during which phones and internet are accessible to youth.

‘There are all these wonderful ways I can dial up or dial down how my daughter has a digital life,’ she said.

Most computers and cell phones come with software programs that allow parents to limit access to certain apps, or create parent-controlled passwords in order to access specific programs or social media sites.

In addition, most internet service providers allow parents to throttle the Wi-Fi to specific devices at certain times of day or night, so kids only have access for a few hours a day or when a parent is home.

Foreman also recommends strong privacy settings on social networking sites.

Michelle Merritt, 53, of Starkville, Mississippi, said she was concerned when her then-12-year-old daughter admitted she was searching Tumblr for suicidal images and posts.

‘It broke my heart that her world was so dark,’ Merritt said. ‘I just tried to help her myself, talking to her, getting her to tell me her feelings; trying to build up her self-worth.’

Ultimately, the adolescent ended up needing intensive therapy to deal with her self-harm and suicidal thoughts. She is now 24 years old and thriving, Merritt said.

As heartbreaking as the social media aspect of her daughter’s suffering was, Merritt said depression and suicide run in the family and that outside forces couldn’t be blamed for what the youth went through.

‘It made her feel better to be able to talk about it, even if she didn’t know the people out there who were listening,’ Merritt said. ‘I was glad she had an outlet, some place she felt safe talking about it. I felt like that helped her.’

However, for parents like Russell, who have actually lost a child to suicide, it’s harder to see the upside.

He told the BBC last month that he has ‘no doubt that Instagram helped kill’ his daughter, a teen who he says had shown no outward signs of anxiety or depression before her death.

‘We are very keen to raise awareness of the harmful and disturbing content that is freely available to young people online,’ he said in a statement issued in January.

‘Not only that, but the social media companies, through their algorithms, expose young people to more and more harmful content, just from one click on one post,’ he added. ‘In the same way that someone who has shown an interest in a particular sport may be shown more and more posts about that sport, the same can be true of topics such as self-harm or suicide.’