One of the last things that Carson Bride did before taking his own life was look to his phone for help.
The 16-year-old had been receiving anonymous messages for months through YOLO, a popular app built on Snapchat, according to a federal lawsuit filed Monday in California. The messages included sexual comments and taunts over specific incidents, like the time he’d fainted in biology class at his Portland, Ore., school.
The messages had to be coming from people he knew, but the app’s design made it impossible for him to learn who was behind them. If he replied to the taunts, YOLO would automatically make the original message public, revealing his humiliation to the world.
His family found him dead on June 23, 2020. The history on his phone showed that he had been searching for “Reveal YOLO Username Online” earlier that morning.
Now Kristin Bride, Carson’s mother, is leading a lawsuit against Snap, YOLO and LMK, another anonymous messaging app built for Snapchat that the teenager used before his death. Her complaint alleges that the companies violated consumer protection law by failing to live up to their own terms of service and policies, and that anonymous messaging apps facilitate bullying to such a degree that they should be considered dangerous products.
The suit, filed Monday in federal court in the Northern District of California, seeks class-action status on behalf of the approximately 93 million U.S. users of Snapchat, a group the company says includes 90% of all Americans between ages 13 and 24, along with the 10 million users of YOLO and 1 million users of LMK. Bride’s co-plaintiff in the case is the Tyler Clementi Foundation, an anti-bullying nonprofit formed by the family of Tyler Clementi, who took his own life at age 18 in 2010 after experiencing cyberharassment from a dorm mate at Rutgers University.
Snap, YOLO and LMK did not immediately respond to requests for comment.
The lawsuit asks the court to ban YOLO and LMK from Snap’s platform immediately, and seeks damages for the alleged harms and misrepresentations.
“The high school students who anonymously cyberbullied Carson will live with this tragedy for the rest of their lives,” said Kristin Bride in a statement provided by Eisenberg & Baum, the firm representing the plaintiffs. “However, it is the executives at Snapchat, Yolo, and LMK irresponsibly putting profits over the mental health of young people who ultimately need to be held accountable.”
To date, those attempting to sue social media companies over their users’ words and actions have met with little success. Most cases against tech companies over content posted by their users are dismissed out of hand thanks to Section 230 of the 1996 Communications Decency Act, which states that no “interactive computer service” can be held liable for information posted by a user on that service.
But changes in the legal landscape and a novel legal argument may set this case apart.
In a ruling last week, the 9th Circuit Court of Appeals opened the door to the idea that social media companies, and Snap in particular, can be held responsible for building or enabling features so clearly dangerous to users that the product is essentially defective.
That case centered on a Snapchat filter that automatically detected how fast the user was moving and let them add that number to a post on the platform. Plaintiffs in the suit argued that the feature incentivized driving at high speeds, leading to a fatal 2017 crash in Wisconsin in which a 17-year-old passenger opened Snapchat moments before the car, traveling at 123 mph, ran off the road and crashed into a tree.
The 9th Circuit reversed a lower court’s decision to dismiss the case on Section 230 grounds, with Judge Kim McLane Wardlaw writing that “this type of claim rests on the premise that manufacturers have a ‘duty to exercise due care in supplying products that do not present unreasonable risk of injury or harm to the public.’”
In the case filed Monday, Bride and the Tyler Clementi Foundation argue that anonymous messaging features such as YOLO and LMK similarly present an unreasonable risk of harm. To bolster this argument, the suit points to multiple generations of anonymous messaging apps aimed at teens, such as Yik Yak, Secret and Sarahah, that have risen and collapsed in recent years, each brought down under the weight of the abuse and harassment it enabled.
The suit also cites research linking anonymous harassment to teen suicide, including a 2007 study that found that students who experience bullying, online or in real life, are nearly twice as likely to attempt suicide. A subsequent 2014 study found that cyberbullying may be even more dangerous, with online bullying tripling the risk of suicidal ideation.
But the suit also pursues a line of argument that draws from consumer protection law, arguing that Snap, YOLO and LMK failed to live up to their terms of service and other commitments to users.
A recent decision by the 2nd Circuit Court of Appeals, based in New York City, suggested that this consumer protection argument might have legs. In that case, the plaintiff argued that the gay dating app Grindr should be held responsible for harassment he received on the app from an ex-boyfriend, who set up a fake account impersonating him and, over the course of 10 months, sent a stream of 1,400 strange men to his house looking for sex.
The court dismissed the case, citing Section 230 protections, and wrote that, for a number of reasons, it was unclear whether Grindr had violated its terms of service. But the ruling indicated that a case showing a more clear-cut violation of those terms and agreements could have merit on consumer protection grounds.
A suit filed in early April against Facebook relies on a similar strategy. The civil rights group Muslim Advocates sued the social media giant alleging that it has failed to live up to its promises to users by allowing hate speech against Muslims to proliferate on the platform unchecked.
A prompt shown to first-time YOLO users warns that the app has “no tolerance for objectionable content or abusive users,” and that users will be banned for inappropriate behavior. Its privacy policy states that the company collects user information in order to enforce that zero-tolerance policy.
But after discovering the bullying messages and the search history on her son’s phone, Bride contacted YOLO through its website, reporting her son’s cyberbullying and his resulting death. The company never responded, according to the suit.
Months later, Bride and her husband, Tom, again tried to contact YOLO through its website and via an email address provided for reporting emergencies, demanding that the company remove the bullying users from its platform. The email carried the subject line “Our Son’s Suicide — Request for Help.” The emergency reporting account replied only with a bounce-back message, saying that the recipient could not be reached due to an invalid address. Two subsequent attempts to contact the company also went unanswered, according to the suit.
LMK, the other third-party app in the suit, made even stronger claims about user safety, writing in its terms of service that it would “go to great lengths to protect our community” from inappropriate usage, including using “artificial intelligence technology” and human moderation to police its content.
And Snap, the suit alleges, has failed to live up to its own promises to users by allowing these apps to run on its service.
Snap allows other companies to build apps for its platform with a set of tools called Snap Kit, and makes clear in its guidelines that it actively reviews all third-party apps, noting that “encouraging or promoting violence, harassment, bullying, hate speech, threats and/or self-harm” is unacceptable, as is having “inadequate safeguards in place to prevent this type of behavior.”
Those guidelines also state that any app featuring anonymous user-generated content is subject to a longer review process, and that if a third-party app falls out of compliance or violates the guidelines, Snap reserves the right to remove it from the platform.
Snap has not removed YOLO or LMK from its platform, “even though it knows or has reason to know through numerous reports that YOLO and LMK lack adequate safeguards to prevent teen users from being victimized by harassment, sexually explicit materials, and other harm,” according to the suit.