The UK government is cracking down on image-based sexual abuse in England and Wales.
Deepfake porn — which uses editing technology to make and share fake images of a person without consent — will be criminalised under the new measures announced today.
Intimate image abuse — often referred to as "revenge porn" — will also be easier to prosecute. Under new amendments to the UK's Online Safety Bill, lawyers will no longer have to prove an intent to cause distress, and those found guilty face six months in prison.
When it is proven a perpetrator also intended to cause distress, alarm, humiliation, or to obtain sexual gratification, the crime carries a two-year prison term. Anyone found guilty of sharing an image for sexual gratification could also be placed on the sex offender register.
"We are cracking down on abusers who share or manipulate intimate photos in order to hound or humiliate women and girls," Lord Chancellor and Secretary of State for Justice, Alex Chalk KC, said in a statement. "Our changes will give police and prosecutors the powers they need to bring these cowards to justice, safeguarding women and girls from such vile abuse."
Survivors report finding the term "revenge porn" harmful because of the implication that they have done something to deserve the violation. As I note in my non-fiction book Rough: How violence has found its way into the bedroom and what we can do about it, "Research by the Cyber Civil Rights Initiative in 2017 found that the use of the word ‘revenge’ is a misnomer, given that revenge or hurting the victim is not the motivation for 79 percent of those sharing non-consensual images or videos. Legal academics also argue that the term ‘revenge porn’ does not fully convey the distress, trauma and humiliation that this type of abuse causes, and they suggest ‘image based sexual abuse’ as a more fitting term."
Non-consensual pornography and intimate image abuse are terms preferred by survivors.
SEE ALSO: 97% of young women have been sexually harassed, study finds

Statistics show that image-based sexual abuse is alarmingly prevalent. According to government figures, one in seven women and one in nine men aged between 18 and 34 have experienced threats to share intimate images. Police received 28,000 reports of the non-consensual disclosure of private sexual images between April 2015 and December 2021.
According to the government announcement, deepfake image abuse has been on the rise in recent years, "with a website that virtually strips women naked receiving 38 million hits in the first eight months of 2021."
"The unsolicited sharing and manipulation of intimate photos is a cowardly and revolting thing to do and has an absolutely devastating impact on the lives of women and girls across the UK," Minister for Technology and the Digital Economy, Paul Scully, said in a statement.
If you have experienced sexual abuse, call the free, confidential National Sexual Assault hotline at 1-800-656-HOPE (4673), or access 24-7 help online by visiting online.rainn.org. If you are based in the UK and have experienced intimate image abuse (aka revenge porn), you can contact the Revenge Porn Helpline on 0345 6000 459. If you have experienced sexual violence and are based in the UK, call the Rape Crisis helpline on 0808 802 9999.