cm0002@europe.pub to Programming@programming.dev · 1 month ago
GitHub: We going to train on your data after all (www.theregister.com)
cross-posted to: technology@lemmy.world, technology@lemmy.zip
Shin@piefed.social · 1 month ago
Stupid, even naive question: what about AGPL code used in this training? Would that mean that the output is also AGPL?
PlexSheep@infosec.pub · 1 month ago
That legal question is not yet clearly answered. I think absolutely yes, but the megacorps and pro-AI people don't seem to care.
calcopiritus@lemmy.world · 1 month ago
They already trained their AIs on basically all the GPL code. They don't care.
Shin@piefed.social · 1 month ago
Should "we" care? And if so, what can we do about it?