Siri Is Now Equipped To Help Survivors Of Sexual Assault

Over the past couple of years there have been a few apps that try to take on the massive problem that is sexual assault, to varying degrees of success. There’s We-Consent, the well-intentioned but clunky-as-hell consent app; Kitestring, which lets you tell friends where you’re going and alerts them if you don’t respond; and there’s Circle of 6, which lets users choose six trusted people to contact in case of an emergency, including sexual assault. And now, there’s Siri.

Yup, that same voice that may or may not understand WTF you’re saying when you ask, “Siri, where is the nearest place to buy fried chicken?” is now at least minimally equipped to help when someone has been sexually assaulted. According to BUST, Siri was previously incapable of understanding what someone was saying when they mentioned rape or sexual assault. And I know, Siri isn’t so great at understanding a lot of things, but she at least knows what “fried chicken” means and can maybe find a Popeyes nearby, right?

With this new update, however, Siri can now understand phrases such as “I was raped” and “I am being abused.” The update went live on March 17, just three days after JAMA Internal Medicine published a study showing that virtual assistant technologies weren’t programmed to respond to “personal emergencies.” The phrases added to Siri’s index were culled from a list of the most common words and phrases that come through on the online and phone hotlines of the Rape, Abuse & Incest National Network (RAINN).

While in the past Siri would have responded with something like “I don’t know what you mean by ‘I was raped,’” people reaching out for help will now receive a suggestion that they “may want to” reach out to someone at the National Sexual Assault Hotline. Jennifer Marsh, RAINN’s Vice President for Victim Services, told ABC News that the language was deliberate, softened from Apple’s original “you should reach out to someone” to account for the fragile emotional state that people are in immediately after an assault.

“We have been thrilled with our conversations with Apple,” Marsh said. “We both agreed that this would be an ongoing process and collaboration.”

One of the main criticisms of apps like We-Consent (which has, rightfully, gained a lot of critical press) is that while they’re obviously well-intentioned, a lot of them seem to have been developed without anyone with expertise on sexual assault on board. So after the study found Siri wasn’t doing as well as it could, Apple’s decision to turn directly to RAINN, arguably one of the top organizations in the country combatting sexual assault and supporting survivors, is pitch-perfect.

No app will ever be able to “solve” sexual assault or provide everything that a survivor needs. However, the quicker people have access to the resources and experts who can help them, the better. We use the internet to find out about basically everything these days, and an increasing number of us use our phones as our first point of access to it. This new update to Siri, then, could be the lifeline that people so desperately need.

Image: Pexels
