Posts: 11
Comments: 438
Joined: 2 yr. ago

Independent thinker valuing discussions grounded in reason, not emotions.

I say unpopular things but never something I know to be untrue. Always open to hearing good-faith counterarguments. My goal is to engage in dialogue that seeks truth rather than scoring points.

  • Current events are nothing new - social media is.

  • Social media is what happened. Consuming negativity all day every day makes people angry and depressed - who would've thought.

  • A major truck stop chain in Finland just recently started doing exactly that.

  • I've been using similar but slightly different ones that I assume do the same thing. You can also copy the filter and just replace the filter word rather than including them all in one. Naturally, the instance name must also be changed to your own.

    lemmy.world##div.post-listing:has(span:has-text(/trump/i))

    And for filtering comments:

    lemmy.world##article.comment-node:has(div.comment-content:has(p:has-text(/trump/i)))
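
    For example, a hypothetical variant with just the keyword swapped ("musk" here is only a placeholder, not one of my actual filters):

    lemmy.world##div.post-listing:has(span:has-text(/musk/i))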

  • This is happening to me constantly as well. I have no idea how someone thought that was a good idea.

  • The age of consent in Germany is 14, but if the older person is over 21 it's 16.

  • Haven't phones been getting slimmer and slimmer for the past 10 years or so?

  • I'd honestly say that post-processing is my strength, and even though that alone isn't exactly a photography technique, I guess I've somewhat mastered taking photos in a way that lets me later make them "pop" with editing.

    I guess another way of saying that is that I've mastered underexposing photos.

  • One million invested in the stock market pays you on average 70k a year in returns. Now imagine having a billion in the stock market.

  • Compound interest. I don't think most people realise how powerful its effect is. Anyone can take advantage of it, but there's no getting around the fact that the more money you have, the easier it gets to make even more, which then makes it even easier, and this just keeps accelerating. There's a rough sketch of the effect below.
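
    A minimal sketch of the acceleration, assuming a flat 7% average annual return (the same figure as the 70k-per-million above; illustrative Python, not financial advice):

    # Illustrative only: assumes a flat 7% average annual return
    balance = 1_000_000
    rate = 0.07

    for year in range(1, 31):
        gain = balance * rate   # this year's return on the current balance
        balance += gain         # reinvest it, so next year's return is computed on a bigger base
        if year in (1, 10, 20, 30):
            print(f"year {year:2}: gained {gain:>10,.0f}, balance now {balance:>12,.0f}")

    # year  1: gained ~70,000  -> balance ~1.07M
    # year 30: gained ~498,000 -> balance ~7.6M, with no new money added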

  • Yeah, and I think Buffalox agrees as well. We were simply talking past each other. Even they used the term "depictions of CSAM", which is the same as the "simulated CSAM" term I was using myself.

  • For me, this was at no point about the morality of it. I've been strictly talking about the definition of terms. While laws often prohibit both CSAM and depictions of it, there's still a difference between the two. CSAM is effectively synonymous with "evidence of a crime". If it's AI generated, photoshopped, drawn or whatever, then there has not been a crime and thus the content doesn't count as evidence of it. Abuse material literally means what it says: it's video/audio/picture content of the event itself. It's illegal because producing it without harming children is impossible.

    EDIT: It's kind of the same as calling AI-generated pictures photographs. They're not photographs. Photographs are taken with a camera. Even if the picture an AI generates is indistinguishable from a photograph, it still doesn't count as one because no camera was involved.

  • Please tell me what facts/definitions of my own I'm spreading here. To me it seems like it's you who's taking a self-explanatory, narrow definition and stretching its meaning.

  • I already told you that I'm not speaking from a legal point of view. CSAM means a specific thing, and AI-generated content doesn't fit under that definition. The only way to produce CSAM is by abusing children and taking pictures/videos of it. AI content doesn't count any more than stick-figure drawings do. The justice system may not differentiate the two, but that is not what I'm talking about.

  • Being legally considered CSAM and actually being CSAM are two different things. I stand behind what I said, which wasn't legal advice. By definition it's not abuse material because nobody has been abused.

  • It's not legal advice I'm giving here.

  • What's blatantly false about what I said?

  • First of all, it's by definition not CSAM if it's AI generated. It's simulated CSAM - no one was harmed in making it. Any harm happened when the training data was created.

    However, it's not necessary that such content even exists in the training data. Just like ChatGPT can generate sentences it has never seen before, image generators can generate pictures they have not seen before. Of course the results will be more accurate if that's what the model has been trained on, but it's not strictly necessary. It just takes a skilled person to write the prompt.

    My understanding is that the simulated CSAM content you're talking about has been made by people running the software locally and providing the training data themselves.

  • I don't see how this has anything to do with federated platforms. I'd argue that watching Loops is just as bad for one's mental health as TikTok is.