• Coskii@lemmy.blahaj.zone
    5 months ago

    I’ve said it many times, but the channels I speak through are small, so from the top!

    If you put your artwork online in any public location, make sure your signature or even a QR code is obnoxiously large and centered on the image. Humans can still see and enjoy what you’ve made, AI won’t be able to discern anything, and if it happens to get ripped by one of those Chinese T-shirt bots, at least anyone who buys will know who the original artist is.
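    For anyone who wants to try this, here's a rough sketch of the idea using Pillow (my assumption — any image library would do): render the signature text, scale it to span most of the image width, and composite it dead center with partial transparency. The function name and colors are just illustrative.

```python
from PIL import Image, ImageDraw

def stamp_center(img: Image.Image, text: str) -> Image.Image:
    """Overlay `text`, scaled to ~80% of the image width, at the center."""
    out = img.convert("RGBA")
    # Render the text on a small transparent layer with Pillow's default
    # bitmap font, then upscale the layer so no extra font files are needed.
    layer = Image.new("RGBA", (7 * len(text) + 4, 14), (0, 0, 0, 0))
    ImageDraw.Draw(layer).text((2, 1), text, fill=(255, 255, 255, 180))
    scale = (out.width * 0.8) / layer.width
    layer = layer.resize((int(layer.width * scale), int(layer.height * scale)))
    # Paste the semi-transparent mark in the middle of the image.
    out.alpha_composite(layer, ((out.width - layer.width) // 2,
                                (out.height - layer.height) // 2))
    return out

# Example: stamp a solid test image with a signature.
marked = stamp_center(Image.new("RGB", (400, 300), "navy"), "© YourNameHere")
```

    A QR code would work the same way: generate it as an image and composite it in place of the text layer.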

    • jsomae@lemmy.ml
      5 months ago

      TIL that there exist people who aren’t bothered by obnoxious watermarks superimposed on an image. I find them aggravating, and I’m not the only one; that’s Shutterstock’s entire business model.

      AI is already making people’s lives worse. Let’s not make human art harder to enjoy in a fruitless effort to resist it. Instead, let’s solve the root of the problem.

    • FiveMacs@lemmy.ca
      5 months ago

      Hey ChatGPT, or whatever AI model: recreate this image without the silly QR code.

  • Zerush@lemmy.ml
    5 months ago

    For image tracking, sharing through Imgur is already enough, for any image, even your own ones; no AI image is needed. I miss a bot in Lemmy like the one that redirects videos to Piped, since Imgur is the worst. Better alternatives made in the EU, like File Coffee or Vgy.me, would be desirable.

  • GrappleHat@lemmy.ml
    5 months ago

    I’m very skeptical that this “model poisoning” approach will work in practice. To pull it off would require a very high level of coordination among the disparate people generating the training data (the images/text). I just can’t imagine it happening. Add to that: big tech has A LOT of resources to play this cat-and-mouse game.

    I hope I’m wrong, but I predict big tech wins here.

    • VeganCheesecake@lemmy.blahaj.zone
      5 months ago

      One thing I was kinda wondering about: as long as there’s nothing in the T&Cs of your instance, don’t you implicitly hold the copyright to your comment? Isn’t the CC license actually more permissive? Or is it more about “that model was trained on content available under this license, to comply with it, they have to follow its terms”?

      • onlinepersona@programming.dev
        5 months ago

        Or is it more about “that model was trained on content available under this license, to comply with it, they have to follow its terms”?

        Close. Creative Commons is a copyleft license with restrictions. The important restriction in this case is not allowing commercial use.

        Anti Commercial-AI license

  • catloaf@lemm.ee
    5 months ago

    No, because a method that works on one implementation almost certainly doesn’t work on another.

  • General_Effort@lemmy.world
    5 months ago

    This doesn’t have anything to do with tracking. This is supposed to sabotage free and open image generators (i.e. Stable Diffusion). It’s unlikely to do anything, though.

    Hard to say what the makers want to achieve with this. Even if it did work, it would help artists just as much as better DRM would help programmers. On its face, this is just about enforcing some ultra-capitalist ideology that wants information to be owned.

    • CheeseNoodle@lemmy.world
      5 months ago

      I see it as trying to combat the dystopia where not only is our data scraped, but now every single thing we write, draw, or film is fed into an AI that will ultimately be used to create huge amounts of wealth for very few, essentially monetizing our very existence online in a way that’s entirely unavoidable and without consent.

      In addition, it’s entirely one-way: Google and others can grab as much of our data as they want, while most of us would have an extremely hard time even getting a freedom-of-information request about ourselves granted, let alone grabbing a similar amount of data about those same corporations.

      • General_Effort@lemmy.world
        5 months ago

        that will ultimately be used to create huge amounts of wealth for very few,

        But… that is what these poisoning attacks are fighting for. They attack open image generators that can be used by anyone. You can use them for fun or for business, without having to pay rent to some owner who is not lifting a finger. What do you think will happen if you knock that out?