• adarza@lemmy.ca · 3 months ago

      81.7 terabytes of books
      (‘more than’, but we’ll use this figure)

      ➗ divided by ➗

      1 mebibyte per work
      (the samples of digital novels I used to estimate size per work were around 500 KB each; I took that and doubled it)

      ✖️ multiplied by ✖️

      $250,000 maximum fine (per infringement)

      🟰 equals 🟰

      $19.48 trillion maximum fine.
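
      The back-of-envelope math above can be sketched directly (figures are the commenter's: a decimal terabyte for the corpus, a binary mebibyte per work):

      ```python
      # Rough sketch of the fine estimate from the comment above.
      TB = 10**12              # terabyte (decimal, as disk sizes are reported)
      MiB = 2**20              # mebibyte (binary)

      corpus_bytes = 81.7 * TB     # 'more than' 81.7 TB of books
      bytes_per_work = 1 * MiB     # assumed ~1 MiB per work (doubled 500 KB sample)
      fine_per_work = 250_000      # statutory maximum per infringement

      works = corpus_bytes / bytes_per_work          # ~77.9 million works
      total_fine = works * fine_per_work
      print(f"{works:,.0f} works -> ${total_fine / 1e12:.2f} trillion")
      ```

      This lands on roughly 77.9 million works and about $19.48 trillion, matching the figure above.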

      • pivot_root@lemmy.world · 3 months ago

        A fine like that will last about as long as it takes for Donvict FElon to sign an order making training AI fall under “fair use,” unfortunately.

        Or, just as likely, the individuals will be the ones found responsible rather than the corporation.

    • schema@lemmy.world · 3 months ago

      If they get sued and the case reaches the discovery phase, they might actually have to produce all the training data they used, and I’m almost certain there is a lot more stolen content in there. But sadly, it will probably not come to that, because they have money.

    • yetAnotherUser@discuss.tchncs.de · 3 months ago

      It’s a win-win either way.

      Although I’m almost rooting for Meta, just so that copyright is weakened in any way. The damage copyright has done to culture and science might even exceed Meta’s damage to society.

  • BradleyUffner@lemmy.world · 3 months ago

    I really don’t understand how LLMs aren’t considered derivative works of the material they were created from under copyright law.

  • fmstrat@lemmy.nowsci.com · 3 months ago

    I would love to see Meta sued by anyone for this. They’d win, but that could theoretically be used as precedent for individuals.