Isn't CSAM classed as images and videos that depict child sexual abuse? Last time I checked, written descriptions alone didn't count, unless someone was being forced to look at AI-generated image prompts for such acts?
- JumpNSFW
There are people who have called the main lemmy.ml instance a community of tankies, so I'd take a lot of these claims with a grain of salt.