Florida Middle Schoolers Arrested for Allegedly Creating Deepfake Nudes of Classmates
Source: Wired
Two teenage boys from Miami, Florida, were arrested in December for allegedly creating and sharing AI-generated nude images of male and female classmates without consent, according to police reports obtained by WIRED via a public records request.
The arrest reports say the boys, aged 13 and 14, created the images of students who were between the ages of 12 and 13.
The Florida case appears to be the first to come to light in which arrests and criminal charges resulted from the alleged sharing of AI-generated nude images. The boys were charged with third-degree felonies, the same level of crime as grand theft auto or false imprisonment, under a state law passed in 2022 that makes it a felony to share any altered sexual depiction of a person without their consent.
-snip-
As AI image-making tools have become more widely available, there have been several high-profile incidents in which minors allegedly created AI-generated nude images of classmates and shared them without consent. No arrests have been disclosed in the publicly reported cases (at Issaquah High School in Washington, Westfield High School in New Jersey, and Beverly Vista Middle School in California), even though police reports were filed. At Issaquah High School, police opted not to press charges.
-snip-
Read more: https://www.wired.com/story/florida-teens-arrested-deepfake-nudes-classmates/
Ran across this story thanks to a tweet from the journalist.
Link to tweet
The replies to her tweet included these:
I'm wondering why there's no responsibility for the makers of these apps or sites that distribute them that minors are using.
Arresting a 13 yo for using an app instead of the app creator...
The services who actually generated it should be held accountable as well. They manufactured the material regardless of who prompted it.
MOMFUDSKI
(5,743 posts)
The Horrors!
louis-t
(23,309 posts)
a pain in the ass.
Midnight Writer
(21,823 posts)
If you give porn to a minor, you go to jail, not the kid.
If you let a 13 year old drive your car, you are the one in trouble, not the kid.
Kids have undeveloped decision-making skills. That is why sex with a minor is the adult's legal responsibility, even if the minor "consents": the minor is considered, under law, to be unable to knowledgeably consent.
A 13 year old gets a software program that allows him to "see" naked people, and he uses it to "see" naked people? Doesn't seem to me that the kid bears full responsibility for this, as reprehensible as it is.
getagrip_already
(14,917 posts)
Seems the darker the skin color, the younger you get charged as an adult. But in this case, they are in juvie.
Chances are they will be allowed to plead down or be granted pretrial probation, with the charges going away if they don't reoffend. Assuming this is a first offense.
Stupid f'in kids. They just don't care what this can do to a pre-teen in a social setting. It can be devastating.
SpankMe
(2,972 posts)
sir pball
(4,763 posts)
Although not at the same time. Point being, it's unremarkable to hold all sides to account, if to different degrees (I ended up with some embarrassment; the senior who gave me the vodka got probation). The kids' lives shouldn't be ruined, but they need consequences. Perhaps AI nudes of them, with tiny, tiny penises?
NanaCat
(1,382 posts)
Often faces juvenile detention.
A kid who nicks beer from the local corner shop, and the cops find him passed out somewhere in public: same thing. Happened to one of my sprog's schoolmates.
So it's not a stretch for these hooligans to face juvenile criminal charges for their actions.
orangecrush
(19,655 posts)
NanaCat
(1,382 posts)
That it can and has happened elsewhere. To wit:
Aussie105
(5,463 posts)
Not too hard for someone to stick your head on a nude.
It's called Photoshop.
Fake celebrity nudes have been around for a very long time. (So I've been told.)
It's not the makers of AI, Photoshop, or guns who take the blame for their misuse.
The blame lies squarely on those who misuse those tools for nefarious reasons.