Posts: 0 · Comments: 67 · Joined: 3 yr. ago

We change constantly. Not sure what happens tomorrow and the past is something we learn from.

  • @stinkytofuisgood @kalkulat Has anyone ever considered that the bad results are a problem of the business model?

    Why should OpenAI (competition with others aside) make the AI answer correctly and with a minimal amount of tokens, if they get paid per token?

    I only just had this thought reading this post.

  • @octoham

    i can't agree with this general statement. i guess it's important to understand that a lot of the multiplayer games you currently want to play have some kind of protection which makes them currently unsuitable to run on linux.

    just to give a different view, mp games which currently seem to work (checked protondb): Elden Ring Nightreign, Dark and Darker (used to work), Helldivers & Helldivers 2, Arc Raiders, Counter-Strike, Overwatch, DOOM in very different versions, Quake Live, Hunt: Showdown 1896, Dead by Daylight, Warframe.

    with all respect for the games you currently love so much, please don't make general statements.

    Some personal advice i want to give: only change to linux if you are also willing to change yourself and adapt. Linux is not Windows-like, macOS is not Windows, just like a PlayStation is not an Xbox or a Switch.

  • @OrgunDonor

    thanks for the answer. i remember there was a scene for sim racing at one point, but i am completely out of that one. Maybe @gamingonlinux has some ideas about racing.

  • @ipkpjersi @v0rld

    I would still like to know which ones and why. Usually there are alternatives out there even for those multiplayer games. Example - all the Valve games are good, and Blizzard's games too ;)

  • @TheOakTree

    IMHO it's not the speed. People are patient enough if the result is good. But let's be honest, the context windows are damn small to handle local context. Try to summarize things which are bigger than an email or a very small article. Try to have a slightly bigger codebase...

    And especially these "smaller" local LLMs have a much more limited quality by default without additional information provided.

    We also don't wanna talk about the expected prices of DDR5 memory for modern CPUs. So even if you have an AI CPU from AMD or similar, most of those PCs won't have 64+ GB of RAM.

    Try a bigger context window: QWEN3:4b with 256k ctx.
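    To make the RAM point concrete, here is a rough back-of-the-envelope sketch of KV-cache memory at a long context. The layer/head numbers below are illustrative assumptions, not the actual QWEN3:4b architecture:

    ```python
    def kv_cache_bytes(n_layers, n_kv_heads, head_dim, ctx_len, bytes_per_elem=2):
        # 2x: one cached tensor for keys and one for values, per layer
        return 2 * n_layers * n_kv_heads * head_dim * ctx_len * bytes_per_elem

    # Assumed architecture for illustration: 36 layers, 8 KV heads,
    # head_dim 128, fp16 (2 bytes per element), 256k token context
    gib = kv_cache_bytes(36, 8, 128, 256 * 1024) / 2**30
    print(f"KV cache at 256k ctx: ~{gib:.0f} GiB")  # ~36 GiB, on top of the model weights
    ```

    Even with these modest assumed numbers, the cache alone eats most of a 64 GB machine before the weights are even loaded.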

  • @RepleteLocum @drmoose

    The LLMs are not run on the GPU but rather on the CPU ("AMD Ryzen AI 400" for the higher model) and therefore use the system memory.

  • @Ulrich

    I totally agree, but not all users will see it the same way ;) You see how often people feel entitled to get some help :)

    I would go even further - if valve supports more hardware and opens up steamos for non-business partners (aka end users) .. the press might pressure them into things they don't want.

  • @Ulrich @mnemonicmonkeys

    Imho: it's a question of support ... all the named distributions are a community effort in support.

    Valve can't, and probably won't, try to put themselves in a situation where they "must" deliver support outside of well-known hardware combinations.

  • @Septian @pet1t

    My take is a little different. If people really want AI, they should pay the additional cost, and AI should be an add-on feature on your PC.

    Additional costs - more powerful AI-centered chips (with as little power consumption as possible) which can use much, much more fast local memory (imho 256 GB should be the long-term minimum).

    That enables local AIs to be the solution for privacy and for control of long-term costs, and i guess in 99% of cases local AIs will do the job fine enough.

    Sadly no one will be on our side, because they want to put AI usage / PC usage overall behind a monthly subscription in the long term.

    Right now we are, as always, in the phase of making people dependent on a technology.

  • @cRaziman i am not into those more control-heavy bullet-hell games, i like my cozy Vampire Survivors or Halls of Torment. Btw, Halls of Torment's input customisation might give you some hours too, and there is a demo for it.

  • @IzzyJ @Nibodhika

    I personally think that if you are gaming on linux you should value valve a lot for how much money they have put into the linux ecosystem, and it's not bad to buy at their store. On the other hand there is a lot of gaming happening outside of steam (including things which won't make it to steam).

  • @FishFace @x00z my small thought - i think today no solution can prevent "cheaters", because you can't tell "cheaters" apart from regular users anymore if they don't want you to.

    Here is why - one PC is running the game, while a second PC emulates keyboard and mouse inputs, using a capture card for video, a microphone / digital capture for sound, and an AI trained on the game.

    So what does any "cheat protection" offer if it doesn't protect against serious cheating?

    PS: "The only still-working protection is lan play with control over hardware / software and players, like done at real events"

  • @therivierakid @drosophila

    As always on Linux you have different possibilities. Most big Desktop Environments like KDE / GNOME / Cinnamon ... can mount devices automatically or on a click on the device. No need for additional entries in fstab.

    If you however want a more general approach, you can use systemd's automount or a fixed mountpoint using fstab.

    Most normal desktop users will be totally fine with the DE solutions.
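    As a sketch of the more general approach: an fstab entry using systemd's automount, so the filesystem is mounted on first access rather than at boot. The UUID and mount point below are placeholders:

    ```
    # /etc/fstab - mount on first access via systemd automount
    # UUID and mount point are placeholders; find yours with `lsblk -f`
    UUID=xxxx-xxxx  /mnt/data  ext4  defaults,noauto,x-systemd.automount  0  2
    ```

    The `noauto` plus `x-systemd.automount` combination means a missing drive won't stall boot, which is one reason to prefer it over a plain fstab entry.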

  • @Auth @floofloof IMHO: Advertisement is another word for recommendation. While advertisement is seen as bad, a recommendation isn't.

    So what advertisement never managed is making itself useful to the consumer. Most consumers maybe want a !!! useful recommendation !!! but not someone trying to force you to buy a certain product.

    So what was the time before ads ... it never existed ... even before tv, radio had advertisements. Even back in the day people hated the advertisements and made music recordings, cutting the advertisements and talking out.

    Some old people might remember press record ... press stop ... rewind a little bit ... and all of this.

    The alternative was to pay a lot of money for music ...

  • @themurphy @rigatti There is one difference ... LLMs can't be more efficient - there is an inherent limitation to the technology.

    https://blog.dshr.org/2021/03/internet-archive-storage.html

    In 2021 they used 200 PB, and they for sure didn't make a copy of the complete internet. Now ask yourself if all this information can fit into a 1 TB model without losing information? (Sidenote: deepseek r1 is 404 GB, so not even 1 TB) ... local LLMs are usually ~16 GB ...

    This technology has never been, and will never be, able to 100% replicate the original information.

    It has a certain use (machine learning has been in use much longer already), but not what people want it to be (imho).
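    The size gap can be made concrete with a quick back-of-the-envelope calculation, using the 200 PB and 404 GB figures from the comment (decimal units assumed):

    ```python
    archive_bytes = 200e15  # ~200 PB Internet Archive storage in 2021 (linked blog post)
    model_bytes = 404e9     # deepseek r1 weights, ~404 GB

    # Even against a fraction of the archive, storing everything losslessly
    # at this ratio is impossible - the model must discard almost everything.
    ratio = archive_bytes / model_bytes
    print(f"~{ratio:,.0f}:1")  # roughly 495,050:1
    ```

    Roughly half a million to one - far beyond what any lossless compression achieves, which is the point being made.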