
Posts: 4 · Comments: 259 · Joined: 8 mo. ago

Just a dude on the internet, looking for content and fun! I love Linux, gaming, writing, reading, music, anime, walks, and occasionally movies too. Chronically ill and anxious as well, which makes life quite interesting...At times.

  • Users needed to be able to choose whether they wanted those LLM features from the very beginning; opt-in is the only sane way Mozilla could've handled this push towards LLM integration in Firefox. You are being "All Hail Corporate" by refusing to hold Mozilla accountable for their user-hostile behavior. In this case, insane defaults and not respecting user choice are the bad behavior on Mozilla's part. The only respectful choice would've been allowing the user to reject LLM features when first starting Firefox. A kill switch cannot be considered enough in this case and never should be!

  • Still not the point: if Mozilla wanted to implement LLM features, they needed to do it right! Opt-in is the ethical, user-respecting way to handle it. We can all use forks, but if you let a corporation get away with insane defaults...It will get worse.

  • They knew this would be a deeply unpopular bill, which is why they refused to open it up to public input...This sets a dangerous precedent for further discriminatory laws. All because the chucklefucks that run the Kansas legislature in my home state are fucking bigots and wanted some brownie points from a dementia-suffering pres and his vocal minority. Disgusting. I don't want my neighbors or Kansas citizens being fucked over like this; it pisses me off on a scale that has no number.

  • It's amazing how robust, but also how fragile, open source is...It's something that needs to be looked at carefully to ensure the health of open source. It makes me want to learn enough to be able to contribute effort.

  • LLM features should be opt-in, full stop. Just being able to turn it off, without ever being given the choice to refuse it, is very scummy behavior on the part of Mozilla. What if they wanted to use the default Firefox? Shouldn't a user be allowed to opt in, with LLM features off by default on every fresh install?

    It's nice having options that respect your freedom of choice and don't force their deluded ideas upon you...I feel that mainline Firefox, made by Mozilla, should do that as well! By not holding Mozilla to higher standards, you get another Microsoft Edge or Chrome situation all over again, this time in open source spaces.

  • Any LLM features should've been opt-in from the start if Mozilla actually gave a shit, asking the user if they wanted that useless bullshit before installing, as this technology isn't polished and can introduce vulnerabilities due to its inherently insecure nature. A kill switch is useful if a user decided that LLMs weren't it and wanted to disable everything wholesale at the click of a button, but only after they originally consented to LLM features being enabled.

    Mozilla is only adding this feature because users made their on-by-default LLM rollout look pretty grim; I was one of those many dissenting voices. They wanted to jump on the LLM hype and cash in on some techbro attention, not considering that some of their user base would outright reject the idea.

  • I'm honestly not surprised, because Microsoft can't manifest a win (even though most users know what could be done); they are immune to thinking outside the box...They are so high on the fumes of burning insane amounts of money on AI. That will fail, and it will be egg on those fuckers' faces!

  • I wouldn't even use ChatGPT for free! Let alone pay for a thing that will confidently surface misinformation and disinformation without any safety rails. Worse, their lying software would even try to isolate me, warping my worldview just for a crumb of engagement...Disgusting.

  • Ah shit, I forgot to turn both those off. Ugh, I've been playing browser shuffle...Good catch!

  • SearXNG is a mixed bag to be honest...I find that Ononoki's SearXNG delivers decent results...It's a bit mixed at times with PieFed being the third option. Some topics are surfaced better than others, so your mileage will vary with SearXNG.

    Using the No AI version of DuckDuckGo sends them a message that their bet on "AI" isn't going to win them any accolades. It's an easy way to push back, as they are tracking those metrics and people can see the numbers of how many users disapprove of AI.

  • LOL, you shouldn't speak for everyone; using 'we' is a bit much, as I hate AI and what it represents. I would love to see the whole AI thing fail so hard that it goes back to being what all the science fiction stories warned us about. The torment nexus shouldn't be given a quasi-existence in this world. Currently there is no AI in this world; there are only gassed-up LLMs whose capabilities the techbros misrepresented and dubiously called "AI" to improve their odds of fooling the masses.

  • I personally don't think there is an ethical way of transforming LLMs into AI, at least not yet, as unfortunately there are too many complications with how it's being peddled and the great slop impact that is hurting open source projects. I do think that companies engaging in machine learning and LLM development need to be heavily restricted and forced to comply with laws that will protect human jobs, human-made content, websites, and software projects from their activities. Data-stealing crawler bots especially need to be regulated, preferably out of existence, as they essentially DDoS the websites they crawl while stealing data from the creatives who host websites and blogs.

    Given that humans can reshape their environment in drastic ways, I think humanity needs to dial our collective focus toward being more in harmony with nature and less on fighting against it. We can do that by better understanding the world, by studying the oceans and fully mapping them in the least intrusive way possible. We have to carefully consider the impact that human activities have on our only home world when new ideas are being considered. I think an ethical approach would be one where technological progress has a positive impact on the natural world while still improving the human condition.

  • Ugh, I use Bitwarden, and Starship might be enabled on Aurora (the atomic distro that I use); good to know what I have to try to fix this weekend.

    I hate that some open source peeps are adopting LLM nonsense, as it's the enemy of open source...A perfect weapon for corporate types and deluded techbros to attack open source projects.

  • You're welcome, we have to fight the good fight! Technological advancement built on the backs of human exploitation and destabilization of the environment isn't it; I'd prefer we take another direction of researching and optimizing solar, wind, and low-emission power generation. I think humanity can advance without harming one another; we just have to find a better path forward. The way LLMs are being utilized at the moment is a harmful scam, and we need to call it out more to reduce their ability to fool the public (which so far isn't convinced by what the techbros are saying).

  • Mmm, I figured your opinion would line up with the line of thinking techbros tend to share. It was nice talking to you, have a good day.

  • I think the most important part is that you are having fun with Arch, as one's OS should fulfill your needs and wants! If I get the wild hair to change my current distro, I'll give archinstall a shot after reading a lot about it.

  • Given the nature of most executives, rich fucks, I can't see anything but dystopia coming from these chuckleheads. If you read their unfiltered and uncensored thoughts (often between the lines of their flowery words), see how they view others, and notice their detachment from the everyday person...It's pretty grim. Makes me wonder what you see.

  • I might consider it someday...If I get desperate enough to try Arch. It is an effective user filter, just like the "user friendly" Arch distros are, and it does work well from what I've read. Except some scary stuff can happen if you really aren't paying attention; manual partitioning can...Get hairy. Not as horrifying as Linux From Scratch, but not the pleasant 15-minute-max install experience of the distros I tend to favor.

  • Corporations are already trying and failing to do anything useful with gassed-up LLMs, and I highly doubt that will become useful in the next 5 to 10 years. However, what I do feel will happen is that all the hope-posting and gaslighting by the shills, those whose success rides on massive adoption, will do their best to convince everyone to use the shit. Who knows how well that will actually turn out; it's an open question at the moment. I hope all their efforts lead to failure, personally.

    If gassed-up LLMs do seep into the gaming industry to the point I can't avoid it, then I'll just stop buying games. Spending money on slop isn't ideal in my opinion. I have a backlog of games that haven't been tainted by baby's first lying software, so it isn't a total loss. I've got years of games to play before choosing another hobby altogether, and I have adhered to boycotts for long stretches of time in other areas. Having the will to research products and spend money only on those that don't offend my sensibilities is natural to me.

    I already won't use Windows because Microslop chose to go hard in the LLM direction and look how well that is going for them. On top of it, I make use of open source software and support projects so that they can continue to thrive, avoiding anything that is stained with enshittification.

    My concerns with gassed-up LLMs that techbros call "AI" are numerous: the ecological effect, the quality of media (games, art, movies, etc.), and how it all affects the people who provide us with software and entertainment in general. Gassed-up LLMs are being used as an excuse to lay off so many people in tech and gaming spaces, as corporations irresponsibly hired in a boom period and then realized they'd have to keep paying people in so-called lean times that aren't actually lean. The myth of infinite growth has scrambled the brains of the C-Suite and distracted toddler investors, so the profits don't look so good these days. Since they are unwilling to first cut off the CEOs who leech off a lot of corporate profits, naturally the very people who make the stuff that enriches companies, execs, and shareholders...Are laid off first. Thrown into a rabid job market that is falling the fuck apart because rich fuckwits are creating a situation that serves to unilaterally fuck over those who don't have a financial cushion to fall back on.

    This will have a profound impact on the quality and selection of media...Humans are naturally creative; we've painted in caves and told stories for most of our history. Storytelling and creativity are an intrinsic part of humanity, and stories feel better when written by someone with that strong ability to tell a compelling tale! Gassed-up LLMs mass-produce sterile, meaningless slop comprised of stolen training data that doesn't classify as art to me...Humans borrow from each other too, but not on an industrial scale like an LLM can. Already, there are sites and instances, even on the fediverse, dedicated to sharing slop...It's so bland. I can't imagine staying interested in buying stories and art produced by slop-generating LLMs. I'll just write my own stuff...Learn how to draw the things that exist in my head; that would be a better use of time and money. Creating feels so good; only a talentless techbro or hollow CEO would want to take that away from people to make a quick dollar.

    LLMs in their current form use so much water and power; it's honestly scary how they'll have to rely on dirty forms of energy production just to keep up with escalating demand. That will have an impact on the local environment and marginalized communities initially, and if enough of these irresponsible data centers are created...The scale of environmental and community impact is going to escalate.

    We only have one home planet, imagine it being so sullied that human life can't exist here anymore. To be honest, due to our current antics, I don't even know if we'll beat "The Great Filter" and become a sufficiently advanced civilization. We might destroy ourselves first, but at least nature would probably recover after we're gone and our works eventually stop interfering with the ecosystem.

    I don't feel humanity should be concerned with AI, as there is so much we don't know about ourselves yet. We lack sufficient understanding of neurology, of why consciousness manifests, and of how to create machinery that can actually mimic our brain structure. It would take many generations of humanity, untold amounts of funding, and multi-discipline research to produce a true AI. Techbros wanna shill now though, so we are stuck with gassed-up LLMs that will probably cause society to collapse, or the rich to finally get put to a French Revolution-style end (if disenfranchised people are feeling spicy enough).

  • Games @lemmy.world

    Finished Dying Light: The Beast, it was a good game.

  • Just Post @lemmy.world

    If Arch wouldn't work for me, perhaps another atomic install known as Aurora would work! (yep, nerd post)

  • Just Post @lemmy.world

    I thought I was going to be one of those, "I use Arch btw" types. Didn't work out LMAO

  • Just Post @lemmy.world

    Starting my day...