Posts: 0 · Comments: 275 · Joined: 2 yr. ago

  • There are few reports of this directly from the industry, because nobody wants to admit a talent shortage. It's a much better sell to claim that you are pivoting towards AI.

    I'm an enterprise consultant for technology executives, and work mostly as a platform architect for a global enterprise. The scale of this issue is invisible to most people.

    I know this is basically "trust me, bro", and I wish I had more to show, but this evolution is happening in plain sight. And it's not like AI introduced this problem either. I'm old. Still, take my Internet connection away from me, and watch me struggle to figure out whether I want .includes() or .contains() on a JS array. It's a sliding scale.
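
    (For illustration, and from memory rather than from docs: arrays only have .includes(), while .contains() lives on DOMTokenList. The snippet below, with made-up contents, is exactly the kind of thing I'd otherwise have to look up.)

        // Membership checks on a JS array (contents are made up for illustration)
        const tags = ["platform", "architecture"];
        console.log(tags.includes("platform"));   // true: Array.prototype.includes (ES2016)
        console.log(tags.indexOf("nope") !== -1); // false: the pre-ES2016 equivalent
        // tags.contains("platform");             // TypeError: tags.contains is not a function
        // .contains() is a DOMTokenList method, e.g. element.classList.contains("active")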

    The problem is that we've reached a point where it's easier to generate a convenient result that communicates well than to produce the "correct" solution that your executives don't understand. Decision makers today will literally take your technical concept from your presentation and have it explained to them by an LLM afterwards. They will then challenge you and your concept based on their interactions with the LLM.

    LLMs are continuously steered towards customer-pleasing behavior; they are commercial products. If you ask them for something, they are likely to produce a response that is as widely understood as possible. If you, as a supposed expert, can't match those "communication skills", AI-based work will defeat you. Nobody likes a solution that points out unaddressed security issues; a concept that doesn't mention them goes down a lot easier. This is accelerated by people also using AI to automate their review work. The AI prefers work that is similar to its own. Your exceptional work does not align with the lowest common denominator.

    You can't "just Google it" anymore, all results are LLM garbage (and Google was always biased to begin with as well). All source information pools are poisoned by LLM garbage at this point. If you read a stack of books and create something original, it's not generally understood, or seen as unnecessarily complicated. If you can ask an AI for a solution, and it will actually provide that, and everyone can ask their LLM if it's good stuff, and everyone is instantly happy, what are the incentives for developers to resist that? Even if you just let an LLM rewrite your original concept, it will still reach higher acceptance.

    You also have to step outside your own perspective to fully evaluate this. Ignore what you believe about LLMs helping you personally for a moment. There are millions of people out there using this technology. I attended seminars with 100+ people where they were instructed on "prompting" to generate technical documentation and compliance correspondence. You have no chance of winning a popularity contest against an LLM.

    So why would I need you, if the LLM already makes me happier than your explanations that I don't understand, and you yourself are inherently motivated to just use LLM results to meet expectations?

    Yes, I know, because my entire enterprise will crumble long-term if I buy into the AI bullshit and can't attract actual talent. But who will admit it first, while there is so much money to be made with snake oil?

  • Nobody pays for that much bandwidth without the ability to manipulate you through profiling and impressions. You are the product. The product is not sharing videos. There is no fediverse platform that makes you its whore. If you were to build a video sharing platform, it would never work, because that is not the product; it's only a feature of what makes up the dopamine machine.

    Lemmy will also never outgrow the commercial platforms, because the commercial platforms were never about content either.

  • The sad truth is, we hardly have any software engineers anymore. Trying to find one who is not a prompt monkey has become a serious challenge. New "talent" especially is a waste of money. You wish it weren't so, but AI is on par with the engineers, especially when those engineers just end up using LLMs anyway. Even people who want to learn now have a poisoned well where facts are impossible to find.

  • Pretty much every word in the picture is contradicted by the Wikipedia article.

  • I heard this phrase once: Trying to save the ecosystem with domesticated bees is like trying to save biodiversity by putting up another cattle ranch.

  • So electric cars don't have valves. Oh, you didn't even think that far ahead with your boomer brain? Try to figure out why they put the warning in the manual. With all that leaded gasoline fogging up their brains, it's fair to assume grandpa drank from a battery on a dare.

  • Operator overloading allows you to redefine what each operator does. It's essential for achieving a truly fucked up code base.

  • Meat, air travel, and sugar water. The cornerstones of our society.

  • The product is spyware by design. It's a honeypot for people trying to save a few bucks while exposing their entire browsing behavior. They even called it Honey...

  • Best to sit on your hands if you have greasy palms.

  • He's no longer getting tax money for his idiotic projects. Weidel would sell her wife and suck him off if it helped her into power.

  • There is zero reason for such conclusions. You could just read her statement.

  • You can install serverless frameworks on your server, though. Best of both worlds.

  • She was the commentary editor, not the one responsible for the piece.

  • Increase how often the drones call the mothership, excellent.

  • “If a target is important enough, they’re willing to send people in person. But you don’t have to do that if you can come up with an alternative like what we’re seeing here,” Hultquist says.

    From the article

  • The one completely rules out the other. There is overlap only where people are under illusions. I keep seeing people talk themselves into trouble because they believe they're competent and are merely having work taken off their hands. Very often these people are simply so weak on the subject that they believe this work is even necessary, instead of just recognizing the core of it.

    Instead of being able to explain a topic, they generate pages of bullshit that somehow talks around the topic.

    Instead of developing algorithms to solve the problem, they just have a pile of code generated that somehow does roughly the same thing.

    If you are truly competent in a subject and have an LLM generate something for you, you end up correcting every sentence.

    Why would you waste your time on that at all, when you already have all the knowledge?

    People only want this feedback loop when they are, in fact, not competent. And then they want to jerk off to the LLM's consumer-pleasing bias. That's all it is.

  • Could be. It's often hard to judge for yourself how competent you actually are.

  • I took your comment more as a jumping-off point anyway; I didn't really mean to criticize all that much. Thanks for the explanations nonetheless. I still think you're describing a hope rather than the reality, though. That's what I wanted to highlight.