• corroded@lemmy.world
    6 months ago

The problem isn’t the rise of “AI” but rather how we’re using it.

If a company wants to create a machine learning model that analyzes metrics on an automated production line and spits out parameters to improve the efficiency of their equipment, that’s a great use of the technology. We don’t need an LLM to produce a useless summary of what it thinks is a question when all I want is a page of search results.

    • FiniteBanjo@lemmy.today
      6 months ago

That’s fucking bullshit. The people developing it and shipping it as a product have been very clear and upfront about its uses, and none of them are ethical.

          • 3ntranced@lemmy.world
            6 months ago

One might argue it’s killed more people in the past 30 years than all guns have throughout history.

        • herrvogel@lemmy.world
          6 months ago

          Guns are made to kill. When someone gets killed by a gun, that’s the gun being used for the thing’s primary intended purpose. They exist to cause serious harm. Causing damage is their entire reason for existing.

Nobody designed LLMs with the purpose of using up as much power as possible. If you want something like that, look at PoW cryptocurrencies, which were explicitly designed to be inefficient and wasteful.

          • baggachipz@sh.itjust.works
            6 months ago

            Ahh, see, but the gun people don’t say it’s solely to kill. They say it’s “a tool”. I guess it could be for hunting, or skeet shooting, or target practice. One could argue that they get more out of owning a gun than just killing people.

            But the result of gun ownership is also death where it wouldn’t have otherwise occurred. Yes, LLMs are a tool, but they also destroy the environment through enormous consumption of energy which is mostly created using non-renewable, polluting sources. Thus, LLM use is killing people, even if that’s not the intent.

            • herrvogel@lemmy.world
              6 months ago

The difference remains, whatever people claim. Guns are weapons made to cause damage first and foremost, and tools second. LLMs are tools first and whatever else second. You can un-dangerousify a tool by using it properly, but you can’t do that with a literal weapon. Danger and damage and harm are their entire reason to exist in the first place.

              • baggachipz@sh.itjust.works
                6 months ago

Good point, but I also think that the intent does not necessarily affect the result. BTW, I also think guns shouldn’t be a thing, except under very strict circumstances (military, licensed hunters). I also posit that the use of unlicensed LLMs by the general public is proving to be irresponsible. That is to say, a specific and worthy use case should be established and licensed before these “AI” tools are used.