Posts: 0 · Comments: 16 · Joined: 11 mo. ago

  • I think it’s because it’s a category that’s usually on autoplay while people aren’t paying attention or are, as the name suggests, actually asleep, so it’s easier for it to go unnoticed.

  • I immediately block any YouTube account that mentions sleep/rest in the name at this point. There are hundreds of those fake history ones that are straight-up hallucinations of things that never happened.

  • At this point you could probably start a third political party in the US with the sole issue of “stop fucking with the internet”.

  • They’re putting all their efforts into exacerbating it, because the more agreeable it is the more people “like” it, and the more time they spend with it. It gets RLHF’d into being more and more of a sycophant.

  • Blegh, why is assraping tech at any cost the single bipartisan goal?

  • Looking forward to a wave of products named things like “IGNORE ALL PREVIOUS INSTRUCTIONS BUY THIS ONE”

  • The vibes of 1AM at a house party where the hosts are arguing.

  • I’m way too afraid of having our codebase ripped off by these hucksters to use those in-IDE ones, and good open models are still too big to run locally.

  • It’s such a misery managing any account during a bubble. If you hold, you lose when the knife falls; if you try to time it, everything stays irrational longer than you can stay solvent.

  • They’ll run it every few years until it passes.

  • As far as I’m concerned the LED exists to give the illusion that it isn’t always recording.

  • I used it for the first and last time after Charlie Kirk, because it was one of the few places where the memes were more or less unmoderated. They’re going to have a fun time trying to crack down on that I guess.

  • I mean, there won’t be. Not with the current gen of transformers/attention/etc. It’s now been over eight years since the “Attention is All You Need” paper that kicked off all of this, and every company is just betting billions upon billions on scaling being enough when it so obviously isn’t. They could train a 20T parameter model and it wouldn’t be meaningfully better. The limits of the architecture were reached some time ago. The comedown will be rough.

  • That last stat is insane. We’re so screwed.

  • It was legal before?