NEW YORK (CBSNewYork) — A pediatrician made a disturbing discovery: videos targeting children on YouTube and YouTube Kids that include tips on how to commit suicide.
With a name like YouTube Kids, you might assume the content is safe for the target audience, youngsters. But the pediatrician behind a blog has found YouTube Kids content with violence, sexual innuendo, and even suicide tips.
“Once this stuff starts to creep into platforms that are made for children, it is extremely concerning,” Dr. Free Hess told CBS2’s Tony Aiello on Tuesday.
On both YouTube and YouTube Kids, Hess found a gamer video into which someone unknown had spliced a dark-humor clip comparing wrist-cutting techniques for suicide.
When asked her reaction to the man making the gesture on the video, parent Lorraine Romero said, “I just see the destruction of kids, and people that are sick.”
“We have no idea what seeing this content does to children,” Dr. Hess said. “Their brains are not fully developed, so they’re not able to think through complex situations such as the things that they’re seeing.”
YouTube is owned by Google. A spokesperson told CBS News, “We rely on both user flagging and smart-detection technology to flag this content for our reviewers. Every quarter we remove millions of videos and channels that violate our policies.”
Hess said, yes, it’s tedious, but parents must monitor children’s screen activities and watch tutorials on parental controls.
“We need to educate ourselves about all of these platforms and which ones have which types of risks,” Hess said.
The internet: making life more convenient, and more complex.
YouTube said it has strict policies against videos that promote self-harm, and is working on ways to more quickly remove content that violates standards.