
Posts 1 · Comments 208 · Joined 2 yr. ago

  • Fucking hypocritical idiots.

  • Stupid regulation, honestly. Exact matches are implementable, but anything beyond that... Aren't they basically banning e2ee at this point?

    Now I see why Signal will shut down in the EU.

  • My bad. But that phrasing is super stupid, honestly. What company would want to promise to detect new child sex abuse material? It's impossible to avoid false negatives.

  • Source? Does the law require that? That's not my impression.

  • My guess was that this law was going to permit something as simple as pixel matching. Honestly, I can't imagine they could codify anything more sophisticated into law. Companies don't want false positives either, at the very least because of profits.

  • Is there a source stating that they're going to require these?

  • If you have a pre-trained model or a classical voice-matching algorithm as the basis, a few samples might suffice.

  • They say the images are merely matched against pre-determined images found on the web. You're talking about a different scenario, where AI detects inappropriate content in an image.

  • Yes, I agree it is dangerous. I just wanted to assess the actual threat (current and future) before jumping onto the wagon.

  • Most clients don't understand art or graphics to begin with, I guess. They just wanted someone good at Illustrator.

  • Article 10a, which contains the upload moderation plan, states that these technologies would be expected “to detect, prior to transmission, the dissemination of known child sexual abuse material or of new child sexual abuse material.”

    This is what I guessed the other day when a post here didn't clarify what the censorship meant.

    While I'm not a fan of this stupid regulation, it doesn't sound like the armageddon that will turn e2ee into ashes.

    (Given that Signal doesn't like it, I might be wrong though.)

    As long as we trust, say, Signal, it could plausibly do the scan without sending off a good chunk of the image data the user is transmitting. URLs can be hashed before being sent to the scanner.

    The remaining piece for privacy is to use open source and to guarantee that the shipped binaries are free of modifications from the original source. This problem has always existed in the Apple ecosystem, btw.
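    The matching step described above can be sketched as a local hash lookup (a hypothetical illustration, not Signal's or the regulation's actual design; real systems use perceptual hashes such as PhotoDNA rather than plain SHA-256, so re-encoded copies still match):

    ```python
    import hashlib

    # Hypothetical list of hashes of known, pre-determined images.
    # The single entry below is just the SHA-256 of b"foo" for demonstration.
    KNOWN_IMAGE_HASHES = {
        "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
    }

    def matches_known_image(image_bytes: bytes) -> bool:
        """Hash the outgoing data locally and check it against the list,
        so the content itself never has to leave the device."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        return digest in KNOWN_IMAGE_HASHES
    ```

    The point is that only the digest (or a match/no-match verdict) would need to cross the wire, not the image.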

  • I think proton was never going to function as a profit-first business. Too many enshittified rival businesses. Kinda the natural outcome.

  • I think the quality definitely degraded, but that's exactly what capitalism wanted. It's going to darwin a big chunk of us through climate change that's accelerated by the electricity needs.

  • This is starting to resemble Putin's rhetoric against NATO since he invaded Ukraine

  • My point is, sacrifices can be made. Even professionals can do it.

    You mean like: they risk losing their job, their profits drop significantly during the retraining period, a few algorithms simply don't exist in Krita, and most of the rest are slower and less optimized. If Adobe releases a new killer feature, the professionals who transitioned to OSS are fucked. They also sacrifice a significant amount of time on additional training for Linux, replace their professional NVIDIA GPUs, tweak Wayland, spend time fixing boot problems, find their printers no longer work, hit compatibility issues with everything Adobe and MS Office, and lose business competitions just because their files can't be opened on Windows, etc. etc. I'll trust you Linux-is-easy people after you've converted a few Windows / Apple / Adobe-dependent enterprise businesses.

  • Means nothing to Recall.

    His testimony comes after Microsoft admitted that it could have taken steps to prevent two aggressive nation-state cyberattacks from China and Russia.

    According to Microsoft whistleblower Andrew Harris, Microsoft spent years ignoring a vulnerability while he proposed fixes to the "security nightmare." Instead, Microsoft feared it might lose its government contract by warning about the bug and allegedly downplayed the problem, choosing profits over security, ProPublica reported.

  • Microsoft CEO Satya Nadella is now personally responsible for security flaws.

    I say BS.

  • Maybe we'll all become AI training data and receive universal basic income.