Editor’s note: This segment addresses sexual violence.
Chicago resident and sexual assault survivor Tracy Lytwyn remembers lying on the bed, frozen.
The man she met through the dating app Bumble had removed his condom without her consent. She recalls thinking to herself, “Oh my God, is this actually happening to me?”
After some processing, she says she realized what happened to her that night in 2018 was a sexual assault and reported the incident to Bumble.
Removing a condom without consent is not a criminal offense, but advocates consider it a form of assault and call the practice "stealthing." The risk of sexually transmitted diseases also makes the act unsafe.
After filing a complaint, Lytwyn received a response from Bumble saying they would look into the matter and thanking her for reaching out. But soon after, she noticed he was still active on the app. This time, she went public.
She tweeted at Bumble saying, "This guy who assaulted me is on your dating app and I've already tried making a complaint," she says. Bumble quickly direct-messaged her on Twitter, saying they had banned him from their platform, she says.
But then a year later, she saw him on the app — again. That, she says, proved to her that there's no filter in place to make sure alleged perpetrators can't access the app again. He likely created a new email address to work around the system that blocks certain users, she recalls Bumble telling her.
Lytwyn didn’t report what happened to Bumble to get personal justice. “I just wanted to make sure that other people in my community were safe from this person,” she says.
She didn’t go to the police because what happened to her isn’t considered a crime. But she believed she could take action through Bumble — a company that’s part of a multibillion-dollar online dating industry that has made pledges to protect users from sexual assault.
But these companies have done little to actually do that, according to an investigation by Columbia Journalism Investigations and ProPublica. Apps and websites like Match, Tinder and OkCupid employ moderators with no special training to handle a wave of reports — sometimes in four minutes or less.
“If you’re going to offer a service like a dating app, then you should have trained people in place,” Lytwyn says. “And it was really surprising to me that I was being connected with somebody who really had no background in how to help me.”
Reporter Elizabeth Naismith Picciani says to dig deeper into the story, Columbia Journalism Investigations and ProPublica put out a crowd-sourcing survey to hear from people who have been affected by sexual violence after using dating apps. They received a range of responses, from incidents of harassment to rape.
Like in Lytwyn’s case, Picciani says her reporting found several users “saw their alleged perpetrator back online and sometimes on another dating app as well.” While Bumble responded to Lytwyn, some other dating platforms are so overwhelmed with complaints about sexual assault that they’re not even getting back to people.
Moderators are under intense pressure to meet quotas, Picciani says. Moderators at Hinge, for instance, process up to 60 complaints an hour — one complaint per minute. Those Hinge moderators don’t respond to the victim, she says, but rather pull relevant details from the alleged perpetrator’s profile, such as birthday, username and name.
Other companies like OkCupid require moderators to gather that information and respond to both the complainant and accused in about four minutes on average, she says.
Moderators who can’t keep up with the time crunch to meet hourly quotas are set back for the rest of the workday, Picciani says.
Picciani and the investigation’s co-reporters spoke with many moderators across the dating app industry and discovered many felt there was no corporate guidance on handling sexual assault cases. Some might argue that moderators shouldn’t be allowed to ban a user without a criminal charge or fear that false sexual assault allegations may arise.
“My response to that would be to look at what the companies are saying publicly — and they have a lot of public promises about banning on [the] first accusation,” she says. “So that’s a standard they’re setting, and whether they’re following through with it is another question.”
She also points to research that shows it’s “quite uncommon for sexual assault allegations to be false,” she says.
On the surface, it may seem as though dating app companies' rickety systems set them up for potential lawsuits from people who've tried to alert them to take a user's profile down.
But Picciani says many haven't been held liable — even in cases where the company had been warned and harm occurred again — mainly because they successfully invoked Section 230 of the Communications Decency Act, which deflects lawsuits claiming negligence for incidents in which users were harmed by other users.
To picture the original intention of the law, Picciani says to think of a Yelp review.
“If someone complains about a restaurant, Yelp isn’t liable for that user’s complaint of the restaurant,” she says. But now, Picciani says Section 230 has been extended to protect “offline harm and algorithms and how the platform is run from the inside.”
Dean Russell produced and edited this interview for broadcast with Todd Mundt. Serena McMahon adapted it for the web.
This article was originally published on WBUR.org.
Copyright 2021 NPR. To see more, visit https://www.npr.org.