Vermont lawmakers target AI sexual abuse

[Photo: code projected over a woman. Credit: ThisIsEngineering via Pexels]

by Aubrey Weaver, Community News Service

Deepfakes — fake images and videos made with machine-learning technology that can realistically depict people’s faces — have made headlines in recent years for fooling viewers, and sometimes news outlets, in relatively harmless ways, such as a viral image of Pope Francis sporting a Rihanna-esque puffer coat.

But lawmakers and advocates say photo-generative software represents a serious escalation of “image-based sexual abuse”: the practice of blackmailing or extorting an individual by threatening to leak nude or sexual images of them. 

Those threats have been increasingly made using fake images or videos created by artificial intelligence software that can put a person’s face on bodies that aren’t their own. Vermont — and all but a few states — has few legal protections against that behavior.

In July 2015, a Vermont law took effect to address a kind of abuse often called “revenge pornography,” criminalizing the disclosure of sexually explicit images without the consent of the person depicted. Almost every other state has a law like that, but only four have provisions that deal specifically with deepfakes, according to the Cyber Civil Rights Initiative, and Vermont isn’t one of them. 

One of the two primary sponsors of that 2015 legislation, Rep. Barbara Rachelson, D-Chittenden 6, is working on an amendment to that law to better protect against new uses of technology to harass people online.

Rachelson attended a briefing at the White House in April with other state representatives from around the country to discuss new policies to address the issue. 

In an interview, Rachelson was quick to say the term “revenge porn” only describes one type of this abuse. “One thing that I’ve learned from going to the White House event was that we really shouldn’t necessarily call it ‘revenge pornography’ because in some cases, the motive is different,” she said. “So it’s not always revenge.” 

Abuse involving deepfake technology and other forms of digital sexual harassment gained more attention after the 2015 law passed, Rachelson said, and she wants to close some of the gaps in the law next legislative session.

Rachelson said the statute’s language around people leaking compromising photographs creates problems for victims. “What I learned were some of the cases that have gotten filed in Vermont, since the law passed, didn’t end up being able to make a guilty conviction because, one, it was not the ex-boyfriend, but his new girlfriend who posted the pictures,” she said, referring to the law’s requirement that the blackmailer be an ex-partner. 

Another case in Vermont involved someone out of state, she said. “We can try to write as many safeguards into it as possible, but Congress also needs to do some legislation in order to protect inter-state situations,” said Rachelson. 

A common argument in discussions about leaked nude photos is that individuals simply shouldn’t take or share them in the first place. But both Rachelson and Catherine Ducasse, associate director and victim advocate at HOPE Works, which provides services to sexual assault survivors, said that is the wrong way to approach the issue.

“It’s never a person’s fault for their trust being violated,” Ducasse said. “That is only the perpetrator or the abusers’ fault who shared the pictures nonconsensually. The onus is 100% on the person who shares those pictures nonconsensually.” 

With AI now used in image-based harassment, the images sometimes depict people who never took or shared nude photos at all. So-called “deepfake pornography” is a byproduct of hyper-realistic AI image generation, and perpetrators are demanding money or real nude photographs in exchange for not posting the generated images publicly.

“It’s the latest scam,” said Rachelson. “Some people are making a living doing this now. And, you know, it’s important to think about sensible ways to address it.” 

Rachelson said a Florida lawmaker at that April White House briefing was the victim of deepfake pornography. “It’s being used to sort of politically ruin her, in addition to just sort of shame and humiliate her, but they’re not even really photos of her.”

Part of limiting the impact of this issue begins with media literacy, said Rachelson. “It’s more about also empowering young people to know who they’re talking to online and to help them figure out if they’re being scammed or to be thoughtful.”

Many victims of image-based sexual harassment struggle with mental health issues during and after the experience because of the traumatic and personal nature of these attacks, Rachelson said.

She said she has heard of instances in which individuals who were closeted had their sexual identity exposed unwillingly to the public and their personal circles; adults have had their images posted to sites used by their place of employment; and minors have seen images of themselves sent to family members, teachers and classmates.

Vermont Attorney General Charity Clark told Community News Service in a statement that her office is monitoring these technological trends “with concern.”

“AI must not be used to victimize anyone — particularly vulnerable populations, like children,” Clark said.

Clark’s chief of staff Lauren Jandl said their office doesn’t have any data on cases involving AI yet. But “sextortion” — the practice of extorting money or sexual favors from someone by threatening to reveal evidence of their sexual activity — has been on the rise in Vermont, Jandl said, especially cases involving children and social media.

“It seems like that is a national trend,” said Jandl. 

Jandl emphasized that the lack of data doesn’t mean AI-related cyber harassment isn’t an issue in Vermont. “There is always a chance that this could have been reported to local law enforcement, which wouldn’t necessarily then come to the Attorney General’s office’s attention.” 

“It is certainly a very interesting and disturbing issue,” Jandl added. 

Rachelson said there’s a link between cyber harassment and youth suicide attempts. 

“Victims of this … didn’t feel like they could talk to their parents or anybody about it,” said Rachelson. “I did hear from our attorney general’s office that they’re hearing about cases like this, but even high school students, teens or middle schoolers, it could be anybody.”

Ducasse shared some of the strategies HOPE Works recommends. “When you’re the victim of this, it feels like it’s out of your control. So really trying to figure out okay, what’s out of my control and what’s in my control and what are my options now, so you can regain some control in some parts of your life, just to kind of be able to move forward.” 

She said counseling could help, and the group suggests grounding techniques for coping in the moment.

Rachelson described the phenomenon as feeling “so unbelievably hopeless,” but she is optimistic about changing Vermont’s laws on the issue.

“I know that people are going to be interested in being co-sponsors and hearing more,” she said.
