all 7 comments

[–]Drewski[S] 2 insightful - 1 fun -  (0 children)

[–]x0x7 1 insightful - 2 fun -  (2 children)

Don't expel kids for doing something you know they are going to do.

[–]Bitch-Im-a-cow 2 insightful - 1 fun -  (0 children)

Kids should know that sharing nudes - fake or otherwise - of 13-year-olds is serious sexual abuse, bullying, and a felony in most places. There's no reasonable argument for allowing those kids back into the same school, where their victims are. Kids who suffer that form of assault experience trauma that lasts a lifetime.

[–]HiddenFox 2 insightful - 1 fun -  (0 children)

They knew what they were doing was wrong. They deserve to be punished. Actions need to have consequences. We all knew that when we were 13.

[–]BumBumCock 1 insightful - 1 fun -  (1 child)

With AI technology, no one is safe these days.

[–]x0x7 2 insightful - 2 fun -  (0 children)

Or everyone is safe if we get our heads out of our asses and stop imagining harms.

[–]HiddenFox 1 insightful - 1 fun -  (0 children)

Fake nude images and fake pornographic videos overwhelmingly victimize women and girls, and such material is easily searchable on major social media platforms and search engines.

This is the part I don't get. If someone posts an inappropriate image of an underage child, how is it not removed? How are the account and IP address not given to law enforcement? At least a few times a year I see a news report of someone being charged locally with kid pics on their computer, so law enforcement is clearly working on this stuff. The last report here was a grandfather posting photoshopped pics of his granddaughter online. (It sounded like he cropped and pasted her face onto someone else's body.)