How Tech Became the New Frontier of Domestic Violence against Women and Girls



Human Wrongs Watch

The UK government will not meet its pledge to halve violence against women and girls unless it tackles tech companies
 


One in three women in the UK has experienced online abuse or harassment | Getty

24 September 2025 (openDemocracy)** — From hiding spycams in children’s toys to coercing partners into online sex work on platforms such as OnlyFans, abusers are increasingly weaponising technology to perpetrate new and insidious forms of violence against women and girls (VAWG).

One in three women in the UK has experienced online abuse or harassment, with almost one in five of them reporting that the perpetrator was a partner or former partner, according to research we at Refuge, the UK’s largest specialist domestic abuse charity, carried out in 2021.

For young women, the scale of the abuse was even higher, with two in three reporting online abuse or harassment.

Perpetrators of domestic abuse often use technology to extend existing patterns of coercive control.

Location-tracking tools, instant messaging, and social media have made it easier than ever to monitor, harass and intimidate partners or ex-partners under the guise of ‘normal’ online behaviour.

We’re increasingly seeing the use of stalkerware, hidden trackers and social media ‘maps’ to surveil survivors – part of a broader pattern of control over what survivors do and who they see.

The issue is now so great that this summer, a little-publicised Home Affairs Select Committee report found that the government is unlikely to fulfil its pledge to halve VAWG unless it introduces effective strategies to tackle online misogyny and tech abuse and increases coordinated funding to do so.

At Refuge, we have seen this crisis unfold firsthand. In 2017, in response to soaring demand, we established a specialist team to support survivors of tech-facilitated and economic abuse.

Referrals to this team increased by 205% between 2018 and 2024, highlighting an urgent need for specialist support.

Stalking was one of the most common harms reported to this team last year, with perpetrators using social media to repeatedly message and keep tabs on their victims.

One survivor, who was stalked and harassed on social media by a man she met on a dating app, told us:

“Somehow he managed to find all of my social media accounts and even those of my friends and family. The messages just kept coming, and no matter how many times I blocked him, he would manage to create new profiles and continue to harass me.

“At first, the police didn’t take the situation seriously and it took me filing an official complaint for them to press charges,” she said. “Just because abuse is happening online, it doesn’t mean that the effects of it aren’t serious.

“My anxiety and depression spiralled because of the abuse and I had to take time off work. I was terrified that he was behind every corner.”

Stories like this have led us, along with others in the sector, to urge Ofcom, the UK’s independent internet watchdog, to include stalking as a standalone ‘key harm’ in its upcoming guidance for technology companies on minimising VAWG, which it is required to publish under the Online Safety Act 2023.

This would ensure perpetrators are held accountable and survivors are better protected.

We also urgently need stronger regulation around surveillance and intimate image abuse, which perpetrators commonly use to humiliate, distress and psychologically harm survivors.

Many survivors tell us their abusers have used hidden spycams – embedded in everything from furniture to children’s toys – to film them without their knowledge, and then used the footage to blackmail and control them.

While it is illegal to share intimate images without consent, or even to threaten to do so, there is a loophole for perpetrators who create intimate images without their victim’s consent but do not share them, even though this is still a serious violation of survivors’ autonomy and privacy.

Proposals in the Crime and Policing Bill to make it an offence to even install equipment intended to capture intimate images without consent are a welcome first step towards resolving this, but the government must go further still.

Ministers must urgently follow through on their commitment to criminalise the taking and making of all intimate images without consent – including the use of AI to generate deepfake nudes from real images.

So-called ‘nudification’ apps, which are specifically designed to produce these images and videos, have boomed in popularity over recent months, leading Australia to announce plans to ban them earlier this month.

This issue of perpetrators weaponising intimate images is being exacerbated by a lack of safeguards on popular sex work platforms, which allow abusers not only to flourish but also to profit from the abuse.

Survivors report being coerced into creating content or having intimate photos or videos of themselves uploaded without consent.

Even where laws are supposedly in place to protect survivors, awareness and reporting remain dangerously low – and when intimate image abuse crimes are reported, they are rarely prosecuted.

The gap between the legislation and survivors’ lived experiences is yet another example of policy failing to keep pace with technology.

A recent UK-wide poll commissioned by Refuge found that fewer than one in three people would report certain forms of digital coercion, such as location tracking or a partner demanding access to their phone, if they happened to them or someone they knew.

Just 58% said they would report the non-consensual sharing of intimate images, falling to 44% among 18- to 24-year-olds.

This exposes a worrying lack of awareness among Gen Z; abuse thrives when it is minimised or dismissed, and this lack of recognition only compounds survivors’ trauma.

Ofcom’s draft guidance on how online platforms should respond to VAWG should be celebrated, particularly its inclusion of online domestic abuse as a ‘key harm’ that tech companies must respond to.

Refuge also strongly supports the watchdog’s recommendation that platforms scan for duplicates of all non-consensual explicit material and ensure such content is delisted from search results, a welcome step towards tackling the spread of intimate images shared or made without consent.

At the same time, we are highly concerned that, once the consultation process has concluded and the guidance is rolled out, it will be hamstrung by the fact that tech companies will not be legally bound to comply.

Time and time again, we see companies prioritise profits over women’s safety – and it would not be surprising to see such behaviour continue.

This is why Refuge and others in the sector are calling for the guidance to be elevated to a legally binding Code of Practice, backed by a cross-government department commitment to tackling tech abuse in the forthcoming VAWG strategy.

A piecemeal approach is not enough. Only a whole-system response can confront the systemic weaponisation of technology against women and girls, and this must include policy frameworks that ensure companies take action to prevent VAWG and are held to account where they fail to do so.

More generally, we need a fundamental policy shift towards designing technology with ‘safety by design’ – not bolting on protections as an afterthought.

In internal research carried out by Refuge in March and April 2025, we found that certain AI chatbot models gave inappropriate responses to prompts about survivors seeking help – including advising them to stand up to their abuser, a suggestion that could put women at serious risk.

This is just one example of the potentially fatal consequences of treating safety as optional. All AI systems and social media platforms must be safety-tested from the outset, with survivor input and consultation with VAWG experts like Refuge.

Meaningful change will come only when survivor voices are embedded in both technology development and regulation, and when tech companies are held fully accountable for the harm they enable.

  • From deepfakes to spycams, technology is the new frontier of abuse. Refuge’s virtual Tech Safety Summit (23-24 Sept 2025) will bring together leading experts to showcase innovative solutions to this growing crisis, featuring Adolescence writer Jack Thorne.
  • Join us in shaping a safer future: Refuge’s UK Tech Safety Summit 2025

*Emma Pickering is head of the Tech-Facilitated Abuse and Economic Empowerment team at Refuge


**SOURCE: openDemocracy. Go to ORIGINAL: https://www.opendemocracy.net/en/technology-domestic-abuse-violence-against-women-and-girls-refuge-government-ofcom/

2025 Human Wrongs Watch

 

 


Source: https://human-wrongs-watch.net/2025/09/27/how-tech-became-the-new-frontier-of-domestic-violence-against-women-and-girls/

