The 14-year-old, from Harrow, north-west London, took her own life in November 2017 after viewing online content about self-harm, depression and suicide.

An inquest into her death at North London Coroner’s Court was shown 17 clips she liked or saved on Instagram, which appeared to “fascinate young people”. Before the clips were played, coroner Andrew Walker told attendees to leave if they were likely to be affected by the footage.

Lawyers and the coroner had discussed beforehand whether the footage should be edited because it was “so uncomfortable to look at”.

“But Molly didn’t have that option, so we would in effect be editing the material for adult viewing when it was available in unedited form for a child,” Mr Walker said.

Describing the footage the court was to see, the coroner said: “It is of the most harrowing nature and almost impossible to watch.

“If you are likely to be affected by such videos, please do not stay to view them.”

Turning to Molly’s family, the coroner said: “There’s no need for any of you to stay.

“In my opinion, this video sequence must be seen [by the court].”

The clips, which dealt with suicide, drugs, alcohol, depression and self-harm, were then played in court. Molly’s family remained in the courtroom as the videos were played, and the coroner took a 15-minute break in proceedings afterwards.

The schoolgirl’s family has been campaigning for better online safety since her death almost five years ago.

Instagram’s guidelines at the time, presented in court, said users were allowed to post content about suicide and self-harm to “facilitate a rally for support” of other users, but not if it “encouraged or promoted” self-harm.
On Friday, Elizabeth Lagone, head of health and wellbeing at Instagram’s parent company Meta, defended the social media platform’s content policies, saying suicide and self-harm footage could have been posted by a user as a “cry for help”.

Ms Lagone told the court it was an important consideration for the company, even in its policies at the time of Molly’s death, to “consider the broad and incredible harm that can be caused by silencing (a poster’s) struggles”.

Ms Lagone also denied that Instagram had treated children like Molly as “guinea pigs” when it launched content ranking – a new algorithm-based system for personalising and ranking content – in 2016.

The lawyer for Molly’s family, Oliver Sanders KC, said: “Isn’t it right that children, including children with depression like Molly, who were on Instagram in 2016 were just guinea pigs in an experiment?”

She replied: “That’s not how we develop policies and procedures at the company.”

Asked by Mr Sanders whether it was clear it was not safe for children to see “graphic images of suicide”, the executive said: “I don’t know… these are complicated issues.”

Mr Sanders drew the witness’s attention to experts who had advised Meta that it was not safe for children to view the material, before asking: “Were you previously told anything different?”

Ms Lagone replied: “We have ongoing discussions with them, but there are many … issues we are discussing with them.”

The court heard Molly created an Instagram account in March 2015, when she was 12, and was recommended 34, “probably more”, sad or depressing Instagram accounts.
Of the recommended accounts, Mr Sanders said one related to self-harm, one to hiding, four to suicidal feelings, one to “not being able to go on”, two to mortality and one to burial.

On Thursday, Pinterest’s head of community operations, Judson Hoffman, apologised after admitting the platform was “not safe” when the 14-year-old used it.

Mr Hoffman said he was “deeply sorry” for the Pinterest posts Molly had seen before her death, saying it was material he “wouldn’t show my children”.

The inquest, which is expected to last up to two weeks, continues.

If you are feeling distressed and isolated or struggling to cope, the Samaritans offer support. You can speak to someone free of charge and in confidence on 116 123 (UK and ROI), email [email protected] or visit the Samaritans website to find details of your nearest branch.

For local services, the national mental health database Hub of Hope allows you to enter your postcode to search for organisations and charities offering mental health advice and support in your area.

Additional reporting by the Press Association.