What kind of material is Instagram recommending that teens view?
For its tests, both The Journal and Northwestern University computer science professor Laura Edelson created new Instagram accounts that listed the user's age as 13. At first, Instagram sent these accounts videos of women dancing seductively or posing in ways that emphasized their breasts. The test accounts skipped other content but watched the sexually suggestive videos, and Reels responded by recommending even racier content.
Instagram users as young as 13 are being sent sexually suggestive videos and images
The Journal says that adult sex-content creators started showing up in the 13-year-old accounts' feeds after only three minutes. After less than 20 minutes of watching Reels videos, the test accounts' feeds were full of promotions from these creators, some of whom offered to send nude photos to Instagram users willing to interact with their posts. The testing revealed that the short-form video platforms belonging to TikTok and Snapchat did not deliver the same sexually focused content to underage users. A TikTok spokesman said, “We err on the side of caution,” and noted that teen subscribers can only view a limited pool of content.
Northwestern University’s Professor Edelson said, “All three platforms also say that there are differences in what content will be recommended to teens. But even the adult experience on TikTok appears to have much less explicit content than the teen experience on Reels.” While most of the testing took place from January through April, this month the Journal once again opened an account for a fake 13-year-old that watched only Instagram-recommended videos featuring women. After half an hour, the account was being shown videos about anal sex.
Meta responds just the way you would expect it to
Meta, as you might expect, shrugged off the results of the tests as not being representative of the overall experience of teens on Instagram. Spokesman Andy Stone said, “This was an artificial experiment that doesn’t match the reality of how teens use Instagram.” He added, “As part of our long-running work on youth issues, we established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months.”
An internal analysis conducted by Meta in 2022 and reviewed by the Journal revealed that Meta knows Instagram shows more pornography to teens than to its adult users. The report showed that teens saw three times as many posts containing nudity, 1.7 times as many violent posts, and 4.1 times as many posts showing bullying as users over 30.
Meta’s own internal analysis suggested that it build a separate recommendation engine for teens to resolve the problem, but so far there does not appear to be any plan to do so. Part of the reason teens are seeing such content is how the algorithm works: content is recommended to users based on the type of content they view the most and linger over the longest.
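To illustrate the general idea only, here is a minimal, hypothetical sketch of an engagement-weighted recommender; it is not Meta's actual system, and the function and topic names are invented for the example. It shows why an account that skips everything except suggestive clips quickly gets a feed dominated by similar clips.

```python
# Hypothetical sketch of an engagement-weighted recommender.
# This is NOT Meta's actual algorithm; it only illustrates the general idea
# that content similar to what a user watches longest gets ranked higher.
from collections import defaultdict

def score_candidates(watch_history, candidates):
    """Rank candidate videos by how long the user lingered on each topic.

    watch_history: list of (topic, seconds_watched) tuples
    candidates:    list of (video_id, topic) tuples
    Returns candidates sorted so topics the user watched longest come first.
    """
    # Total watch time per topic acts as a simple "interest" signal.
    interest = defaultdict(float)
    for topic, seconds in watch_history:
        interest[topic] += seconds

    # Score each candidate by the accumulated interest in its topic.
    return sorted(candidates, key=lambda c: interest[c[1]], reverse=True)

if __name__ == "__main__":
    history = [("dance", 120.0), ("sports", 15.0), ("dance", 90.0)]
    feed = [("v1", "sports"), ("v2", "dance"), ("v3", "cooking")]
    # A user who lingers on "dance" clips is shown more "dance" clips first.
    print(score_candidates(history, feed))
    # [('v2', 'dance'), ('v1', 'sports'), ('v3', 'cooking')]
```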