Dating Apps: Digital Secretary To Question Tinder And Grindr About Child Safety Measures

The Digital Secretary said he will write to Tinder and Grindr asking what measures they have in place to keep children safe, after an investigation claimed children are at risk of exploitation on such apps.

Police have investigated more than 30 incidents of child rape since 2015 in which victims were sexually exploited after evading age checks on dating apps, the Sunday Times said.

The newspaper said records obtained through Freedom of Information laws showed 60 further cases of child sex offences via online dating services, including grooming, kidnapping and violent sexual assault.

The youngest victim was eight years old, the paper said.

Jeremy Wright, Secretary of State for Digital, Culture, Media and Sport, described the findings as “truly shocking”.

He added: “I will be writing to these companies asking what measures they have in place to keep children safe from harm, including verifying their age.

“If I’m not satisfied with their response, I reserve the right to take further action.”

Grindr told the Sunday Times: “Any account of sexual abuse or other illegal behaviour is troubling to us as well as a clear violation of our terms of service.

“Our team is constantly working to improve our digital and human screening tools to prevent and remove improper underage use of our app.”

Last week a man was jailed for two and a half years after spending the night with a 12-year-old girl he had met on a popular adult dating app, believing she was 19.

Carl Hodgson, 28, invited the child to his flat in Manchester city centre a few days after they first made contact via an app.

He pleaded guilty to causing or inciting a child under 13 to engage in sexual activity, engaging in sexual activity in the presence of a child, distributing an indecent photograph of a child and making indecent photographs of a child.

Tinder said it uses both automated and manual tools to moderate users, including scanning profiles for ‘red flag’ images, and said it also depends on users to report profiles that may belong to a minor.

A spokeswoman said: “We utilise a network of industry-leading automated and manual moderation and review tools, systems and processes – and spend millions of dollars annually – to prevent, monitor and remove minors and other inappropriate behaviour from our app.

“We don’t want minors on Tinder.”

Wright said the paper’s findings provided “yet more evidence that online tech firms must do more to protect children”.

It comes after Instagram pledged to ban graphic images of self-harm after Health Secretary Matt Hancock said social media companies “need to do more” to curb their impact on teenagers’ mental health.

The announcement followed the death of 14-year-old Molly Russell, whose family found she had viewed content on social media linked to anxiety, depression, self-harm and suicide before taking her own life in November 2017.