Posts: 0 · Comments: 35 · Joined: 1 yr. ago

  • But spending a lot of processing power to gain smaller sizes mostly matters in cases where you want to store things long term. You probably wouldn't want to keep the exact same LLM, with the same weights and everything, around in that case.

  • Yeah, but that would limit the use cases to very few. Most of the time you compress data either to transfer it to a different system or to store it for a while; in both cases you wouldn't want to be tied to the exact same LLM. Which leaves almost no use case.

    I mean... cool research... kinda.... but pretty useless.

  • Ok, so the article is very vague about what's actually done. But as I understand it, the "understood content" is transmitted and the original data is reconstructed from that.

    If that's the case, I'm highly skeptical about the "losslessness", i.e. that the output is exactly the input.

    But there are more things to consider, like de-/compression speed and compatibility. I would guess it's pretty hard to reconstruct the data with a different LLM, or even a newer version of the same one, so you'd have to make sure that years later you can still decompress your data with a compatible LLM.

    And when it comes to speed, I doubt it's anywhere near as fast as using zlib (which is neither the fastest nor the best-compressing option out there...).

    And all that for a high risk of bricked data.
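    As a rough baseline for the zlib comparison above, here's a minimal sketch (Python's built-in zlib module, with hypothetical sample data) of what makes classic compression attractive: it's deterministic, lossless by construction, and decompression needs nothing but the library itself, no matching model:

    ```python
    import zlib

    # Hypothetical sample: repetitive text, the kind zlib handles well.
    data = b"the quick brown fox jumps over the lazy dog\n" * 1000

    # Compress at level 6, zlib's default speed/size trade-off.
    compressed = zlib.compress(data, level=6)

    # Decompression needs nothing but zlib itself -- no model, no weights.
    restored = zlib.decompress(compressed)

    assert restored == data  # lossless by construction
    print(f"{len(data)} -> {len(compressed)} bytes")
    ```

    Any zlib build, on any machine, years from now, will reproduce `data` byte-for-byte, which is exactly the property an LLM-dependent scheme struggles to guarantee.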

  • Shoot this fucker in the face already

  • Well, the damage was done by the government before; Scholz simply didn't change anything about it afterwards. Saying Scholz caused the damage would imply that the Bundeswehr was in a completely different state three years ago, which is absolutely not the case. For decades we simply haven't had anyone in government who would do anything sensible. Sadly, it's then hardly surprising that disgusting parties like the AfD are gaining support...

  • I would appreciate stripping citizenship for racism- or fascism-related crimes.

  • Programmers can double their productivity and increase the quality of their code?!? If AI can do that for you, you're not a programmer, you're writing some HTML.

    We tried AI a lot, and I've never seen a single useful result. Every single time, even for pretty trivial things, we had to fix several bugs, and the time we needed went up instead of down. Every. Single. Time.

    The best AI can do for programmers is context-sensitive auto-completion.

    Another thing where AI might be useful is static code analysis.

  • Why would I need AI for that? We should really stop trying to slap AI on everything. Also no, I'm not that big of a fan of wasting energy on web crawlers.

  • Yes I agree on that. A lot of people write "C with classes" and then complain...

  • "Americans" is a bad name, but it's more specific than "United Statesians". I would fully support dissolving that country and founding a new one (or multiple) with a better name, though.

  • I'm a full-time C++ developer, mostly doing high-performance data processing plus some visualization and TUI tools, and as someone who loves C++: it's not as simple as you frame it. In sufficiently complex code you still have to deal with these problems. Rust has some good mechanisms in place to avoid them, though there are things on the way for C++26.

  • But if your tool chain is worth anything, the size of each binary shouldn't be any bigger. To oversimplify things a bit: it's just #ifdefs and a proper tool chain.

    In the web development world, on the other hand, everything has always been awful. Every Node.js package pulls in half the world as dependencies...

  • No, because forking a distro and updating several hundred thousand PCs isn't done in a week.

    Edit: and why would we go with Ubuntu...

  • Living under a rock eh?