There have been numerous reports in the past week of a character called “Momo” appearing in children’s videos on YouTube. Momo appears in clips of popular children’s content, such as Peppa Pig episodes and Fortnite videos, posted on YouTube and tells children to hurt themselves and do terrible things, according to The Daily Dot.
These reports have led YouTube to remove videos and ads related to the Momo character. Police departments have also issued warnings about the character.
“I think the Momo character is very bad, it can affect children psychologically and be harmful to them,” Jack Lyme ’20 said.
The Momo videos have appeared despite YouTube’s previous efforts in 2018, when the platform removed over a million channels after implementing tougher restrictions and guidelines to remain a safe and welcoming community.
Between July and September 2018, more than 8 million YouTube videos were taken down for various guideline violations, according to CNN Business.
Georgia Cohen ’21, a student with a YouTube channel, explained why YouTube is enforcing these policies.
“YouTube has been very strict about the things you say in your videos just because it is a very diverse platform in terms of age, so there are young kids watching,” Cohen said.
Cam Manna ’21, another student who runs a YouTube channel, takes time to ensure his videos are suitable to post because of the specific audience that watches them.
“Usually when we make a video we watch it over and over again just to be sure [it’s appropriate],” Manna said. “We have a younger audience watching it.”
YouTube videos can also be demonetized if YouTube staff deem them inappropriate, meaning they are unsuitable for advertisers. Creators can’t earn money from those videos, and in some cases the videos are removed from YouTube entirely. These videos are hidden from public view because they undermine the site’s goal of a comfortable and secure social media platform.
“I think the YouTube community is really great. It’s a space where you can share your ideas and make videos about things you truly like,” Cohen said.
According to YouTube’s Transparency Report, when determining which videos should be demonetized, YouTube looks for spam, misleading information, scams, nudity or sexual content, content that endangers children, repeated policy violations, impersonation, promotion of violence and harassment or cyberbullying.
Once a YouTube channel receives three violations, or strikes, within 90 days, the channel and all of its videos are permanently removed from the site.
“I don’t have a big platform [on YouTube]… but I’m more worried about big YouTubers being affected,” Cohen said.