For the record, I was a stupid kid who primarily wanted to talk about video games … I noped the fuck out pretty hard after I realized how truly deranged most of the people/content on 4chan was.
… and obviously after the uh, FBI cooperation message.
Sorry to double post but this is potentially relevant:
So, yeah, the FBI maintains a large database of CSAM, for their… CSAM content ID style system.
People who know that that exists have occasionally pointed out that… that is kind of weird, let's just put it that way.
Well, I’ve just had a horrifying realization.
We currently have AI/LLM corpos on the record going to places like AnnasArchive to acquire their vast troves of data to use in training an LLM.
… If somebody hooks up this FBI CSAM database… into an LLM… well, you now basically have a machine that produces CSAM, and likely also gore/torture videos.
… hooray …
Like uh, Pete Hegseth just … you know, hooked up Grok/XAI into… apparently I guess all of the US’s military systems.
The scenario I am describing is unfortunately plausible.
I updated my comment with a lot more info.
You may have missed how there’s a massive CSAM problem with AI because the models are all trained on CSAM. “Plausible” is a bit behind the times.
Uh… yes, I apparently did miss that.
… what?
How… did that happen?
Can you link an article or something that substantiates that?
https://pulitzercenter.org/resource/how-we-investigated-epidemic-ai-generated-child-sexual-abuse-material-internet
https://www.engadget.com/ai/california-is-investigating-grok-over-ai-generated-csam-and-nonconsensual-deepfakes-202029635.html
There’s a whole media cycle on it from the last few months.