“This is a massive, massive crisis,” says Emma Pickering, head of the tech and economic abuse team at Refuge. “But we are not at all equipped to deal with it.”
The new arsenal
Tech abuse can take many forms. Abusers track partners via GPS-enabled apps, restrict access to accounts and rack up debts in their victims’ names. Some use so-called “spoofing apps” – tools that disguise a caller’s real number and location – to intimidate their target or impersonate them.
“They can then contact agencies and cancel different kinds of appointments,” Pickering says.
Other patterns are more disturbing still. Abusers are using data from period-tracking apps to monitor and exploit their partner’s fertility.
“We’ve sadly seen survivors say to us that perpetrators have used that information to assault them during times when they could be ovulating,” Pickering says, “because they want to get them pregnant as a form of control.”
MSI Reproductive Choices, which works to expand access to reproductive healthcare, sees the consequences of this directly. Caroline Day, a nurse and safeguarding lead at MSI, describes a recent case: a client who came in early in pregnancy disclosing severe domestic abuse – physical violence, including strangulation, and strict monitoring of her daily movements. She was “frantic” to access an abortion before an app could alert her partner.
Last year, referrals to Refuge’s specialist tech abuse team rose by more than 62%. The anonymised case records – shared by Refuge with Big Issue – make for grim reading.
After Leila – not her real name – left her partner and applied for council housing to move away from the abuse, her ex hacked her email to intercept the replies.
Jayne – also not her real name – was pressured by her partner into sharing her login details to “prove she wasn’t cheating”. He then used that access to her banking and shopping apps to take out a loan in her name.
“Power and control underpins everything in a domestic abuse and intimate partner violence setting,” says Leonie Tanczer, associate professor at UCL’s Department of Computer Science. “But now you don’t have to be physically present to make someone’s life really miserable.”
Built without thinking
Tanczer runs UCL’s Tech and Gender Lab and has worked with perpetrator programmes to understand how abusers exploit technology – and how those systems might be redesigned. She herself started as a computer scientist, surrounded by engineers who, she says, often didn’t question what they were building.
“I loved being surrounded by engineers, but some of the features that they proposed were quite… well, while they were very euphoric about them, I sometimes wondered [if they were] quite uncritical about some of them. And so I just started to think constantly about how some of these things could be misused.”
Companies do conduct threat modelling. But commercial incentives often override safety concerns. Tanczer understands the bind – if Apple doesn’t offer a feature, a competitor will – but argues there are practical safeguards that companies could implement today: discreet privacy modes, stronger permissions, delays or secondary verification before one device can access another.
“I understand that Apple or any company is in a really weird position because the majority of your clientele have a legitimate reason to have [an AirTag],” she says. “They want this. And if I wouldn’t offer it, someone else will, like Tile.”
But Pickering has little sympathy for the companies.
“Everyone’s just trying to create the new, innovative tool. No one’s really stopping and thinking about the risks around the advancement,” she says. “We’re moving at a pace that we’ve never seen before with tech, so that can come hard because it’s not created thoughtfully.”
The legal and policing system is not built to respond to these threats.
When Mina reported the tracking to police, she was told no crime had been committed because she had “not come to any harm”. Leila received the same response. “They confirmed that they cannot arrest the ex-partner as he had not caused any harm with the information that he had,” Refuge noted in her case record.
Police forces often fail to understand how tech-facilitated abuse escalates, or that it constitutes abuse at all.
“There’s still so much work that needs to be done there, and the police are really under-resourced for this as well. We know a high proportion of crime relates to online offences, and yet they’re not given the digital tools, resources, training that they need to be able to respond,” Tanczer says.
Slow, expensive and exhausting justice
For those survivors who do pursue justice, the system is slow, expensive and exhausting. Reporting, investigation and prosecution can take years. And AI is now complicating the evidential picture further: deepfakes mean that genuine images of abuse can be contested in court.
“We are in a new environment where even if you have a photo, you have to prove that that photo wasn’t tampered with,” Tanczer says. “There’s this new sector of private investigators, but also digital forensic experts, that you have to kind of bring in to say, with a level of certainty, that photo is real.”
Some point to further tech solutions to these problems: apps that timestamp evidence for court, and chatbots trained to advise survivors on their legal options. Tanczer is sceptical.
“The amount of money and resources we have poured into that, that could have gone to actual interactions with people. They’re not flawless. Think of an interaction with, let’s say, the internet service provider website. You end up in this loop and you just wait for the final prompt to say, speak to a human.”
What’s actually needed, she argues, is better-funded policing, specialist training, sustained investment in frontline services and a cultural shift that recognises tech abuse as coercive control.
“Fixing legislation is good and important,” Tanczer says. “But… it’s not the cure to this problem.”
The problem keeps evolving. Nobody can predict what the threat will look like in a year’s time, let alone five.
“The reality is, we don’t actually know what comes next,” says Pickering. “And I think that’s the concerning thing.
“We can foresee certain trends or areas of concern around particular technology, wearable tech, AI advancements, deep fake imagery, etc. But there could be something that comes onto the market in six months’ time that none of us are prepared for.”
If you or someone you know is in danger, the government has a comprehensive guide to all the ways you can get help.