China-owned TikTok is facing claims that its algorithm steers violent videos toward black users more than white users, in a lawsuit blaming the platform for the death of a 14-year-old girl named Englyn Roberts.
Bloomberg reports that a recently filed lawsuit alleges that China-owned TikTok's algorithm purposefully steers violent content to black teens over white teens. The claim comes as part of a suit over the death of Englyn Roberts, a 14-year-old black girl. The complaint also names Facebook, Snapchat, and TikTok's parent company ByteDance as defendants.
The lawsuit is the latest in a long line of suits blaming social media companies for teens becoming addicted to their platforms and products. The parents of Englyn Roberts, who died in September 2020 roughly two weeks after attempting to take her own life, allege that TikTok is aware of biases in its algorithm relating to race and socioeconomic status.
The complaint, filed Wednesday in San Francisco federal court, claims that Roberts would not have seen, and become addicted to, the harmful content that contributed to her death if not for TikTok's allegedly biased algorithm.