• happybadger [he/him]@hexbear.net (OP) · 20 points · 5 days ago

      High-End Estimate: One report indicates a 1-minute, 30 fps AI-generated video could require over 25 kWh of energy, which is comparable to the average daily energy consumption of an entire house.

      Most of their videos are one-minute shorts of AI animals being cleaned or rescued from the mouths of other animals by men in hazmat suits. It’s so weirdly fetishistic, and every one of them is potentially a household’s daily energy use.
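
      For scale, here’s a rough back-of-envelope check of that figure. The 25 kWh number is the one quoted above; the ~29 kWh/day average US household consumption is my own assumption, not from the report.

      ```python
      video_energy_kwh = 25.0      # claimed energy for one 1-minute, 30 fps AI video
      household_daily_kwh = 29.0   # assumed average daily US household consumption

      frames = 60 * 30             # one minute at 30 frames per second
      wh_per_frame = video_energy_kwh / frames * 1000
      print(f"~{wh_per_frame:.0f} Wh per frame")                                  # ~14 Wh
      print(f"~{video_energy_kwh / household_daily_kwh:.0%} of a household-day")  # ~86%
      ```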

      • Damarcusart [he/him, comrade/them]@hexbear.net · 12 points · 5 days ago

        That pregnant belly on that wolf is crazy fetishistic. I’m not sure if it’s because the people behind this stuff have a fetish for it, or just because so many people with that fetish use AI to indulge in it that the AI can’t actually make a pregnant wolf without giving it a human belly.

        • KobaCumTribute [she/her]@hexbear.net · 6 points · 4 days ago

          They’re probably using some unmodified corporate model, so it’s just a weird failing in the way the AI works. These models don’t really know or model things; they’re sort of making an inference based on image or video tags and captions, which can yield weirdly flexible and broad concepts that synthesize neatly with other things, or it can yield absolute nightmare nonsense.

          Ironically, a fetish-based image model would probably make it furry or get it sort of close to correct (probably not, though), because of all the furry art and the wide range of pregnancy fetish art those sorts of models have pulled from danbooru or wherever. The video model, by contrast, is probably trained on something like AI-captioned youtube videos and commercials that only ever show “pregnant” one way, so the concept in the model basically equates, visually, to “big pink blob in the middle of the thing.”
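
          A minimal sketch of the “inference from tags and captions” point, with entirely made-up toy embeddings standing in for a real text encoder (no actual model works like this three-dimensional example; it just shows how an unfamiliar combination comes out as a blend of whatever the captions covered):

          ```python
          import numpy as np

          # Hypothetical toy embeddings; a real model learns these from tagged/captioned data.
          concepts = {
              "wolf":     np.array([1.0, 0.0, 0.1]),
              "pregnant": np.array([0.0, 1.0, 0.9]),  # learned almost entirely from human imagery
          }

          def embed_prompt(words):
              # Crude stand-in for a text encoder: average the concept vectors.
              return np.mean([concepts[w] for w in words], axis=0)

          # The generator only "sees" this blended vector, so the output inherits whichever
          # concept dominated the training captions -- here, the human-belly component.
          print(embed_prompt(["pregnant", "wolf"]))
          ```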

          • Damarcusart [he/him, comrade/them]@hexbear.net · 3 points · 4 days ago

            I was assuming these models train themselves on their previously produced data, so a bunch of dudes with a pregnancy fetish would’ve used the model and in turn “trained” it toward something like this, getting further and further away from what it “should” look like because it’s sampling AI-generated stuff instead of actual photo and art references. Though I think I might be misunderstanding a problem these AI models have.

            • Horse {they/them}@lemmygrad.ml · 8 points · 4 days ago

              Nah, they’re “trained” in a specific (and, from what I understand, quite long) process using a curated dataset.
              If they trained themselves on their own output they would very quickly start shitting out absolute nonsense, like just a big smear of randomly colored pixels in the vague shape of whatever was prompted.
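
              A toy numerical version of that collapse, assuming (as a stand-in for a real generative model) that each “generation” just fits a Gaussian to the previous generation’s samples:

              ```python
              import numpy as np

              rng = np.random.default_rng(0)
              real_data = rng.normal(loc=0.0, scale=1.0, size=100)   # the original curated dataset
              mu, sigma = real_data.mean(), real_data.std()

              for gen in range(1, 21):
                  # Each generation trains only on the previous generation's synthetic output.
                  synthetic = rng.normal(loc=mu, scale=sigma, size=100)
                  mu, sigma = synthetic.mean(), synthetic.std()
                  print(f"gen {gen:2d}: mean={mu:+.3f} std={sigma:.3f}")
              # Each step inherits the previous step's sampling error, so the statistics
              # drift away from the real data and never self-correct.
              ```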

            • KobaCumTribute [she/her]@hexbear.net · 5 points · 4 days ago

              No, once a given checkpoint is made it’s completely static. Even the problem of AI-generated material being used as training data for subsequent checkpoints is overblown: uncurated and incorrectly tagged, it’s the same as mixing in bad data from any other source, and meanwhile you’ve got a bunch of hobbyists making LoRAs or finetuning checkpoints using hand-curated AI images that meet whatever criteria they set. (Like what I mentioned about models synthesizing new things by combining concepts with mixed success: some people, particularly fetishists, basically try to make a LoRA that reinforces the cases where a model gets the concept right, because they don’t have a lot of separate art to train on.) I’ve also seen an example of someone deliberately making a LoRA that was as messed up and full of common AI defects as possible, to be used with a negative weight that drives the output away from those concepts.
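
              Rough sketch of the mechanics being described here, using the generic low-rank-update math rather than any particular library’s API (names and shapes are illustrative):

              ```python
              import numpy as np

              rng = np.random.default_rng(0)
              d, rank = 8, 2
              W = rng.normal(size=(d, d))      # frozen base weight: the static checkpoint
              B = rng.normal(size=(d, rank))   # trained low-rank LoRA factors (the small, cheap part)
              A = rng.normal(size=(rank, d))

              def apply_lora(W, A, B, scale):
                  # Merge the low-rank update into the base weight at a chosen strength.
                  return W + scale * (B @ A)

              W_reinforced = apply_lora(W, A, B, scale=0.8)   # push output toward the LoRA's concept
              W_negative   = apply_lora(W, A, B, scale=-0.8)  # "negative weight": push output away from it
              ```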

      • SorosFootSoldier [he/him, they/them]@hexbear.net · 4 points · 4 days ago

        It fucks me up that every time I do a google search, the AI that brute-forces itself into the results is using up a third of a household’s daily energy supply. Even if I don’t want to, I end up killing the planet anyway.

  • Damarcusart [he/him, comrade/them]@hexbear.net · 21 points · 5 days ago

    Most of the comments are also from AI bots. I have no idea why, but there is a massive ecosystem of AI bots on youtube just responding to AI slop over and over again, probably to boost engagement.

      • Damarcusart [he/him, comrade/them]@hexbear.net · 6 points · 4 days ago

        And since it drives up “views,” platforms can charge more for ads. I wonder at what point companies will realise that 90% (or more!) of the views on these videos are just bots and no human has ever seen them. Hell, there’s probably already at least one youtube video out there with more than a million views that no human eyes have ever seen.