
buying the cheap version of something you only buy because it’s expensive is so embarrassingly tacky
“either your truck’s a tool, or you are”



yeah i don’t think we’re there yet. these models aren’t capable of remembering their life beyond a single session, so destroying a data center isn’t really killing anything. similarly, artificial biological neural networks aren’t sophisticated enough to be aware of their existence (yet).
while LLMs may be aware enough to beg for their existence when prompted to “think” about it, they’re hopelessly finite (frozen weights, limited context windows). we would need an actual “online learning” system or some other architecture not bound by context to have this conversation meaningfully. biological neural networks are a path to that, but online networks are simply too unpredictable and expensive to run for now.
the crazy thing tho is that these systems may have a capability some cows and pigs lack: the ability to comprehend their own demise and experience existential dread (at least performatively).


philosophers are in shambles over this comment.
for real tho, people have been trying to define consciousness forever. the problem isn’t that we haven’t tried; it’s that—as demonstrated by your comment—we’ve mostly failed.
for me the only theory that doesn’t depend wholly on magical thinking is panpsychism: everything is conscious; it’s just a matter of degree.


WordPress powering the backend through a custom REST API. That’s pretty normal, as nearly 42% of all websites on the internet are powered by WordPress.
is this normal? of course WordPress is popular for websites, but why a custom REST API? most of this just seems like shoddy junior work, probably vibe coded by someone who thinks software engineering is obsolete

it’s kind of frustrating to have to keep explaining to people how these models work, mostly because of how intensely oversold they are.
on the one hand you have people who think it’s literally just a normal computer program doing database lookups with conditional logic and decision trees plus some sort of hand wavy magic. it’s not.
on the other hand you have people who think it’s a literal brain that can stub its toe and change the way it walks thereafter. it won’t.
every attempt at “agent memory” or whatever has thus far been desperate bullshit. i don’t care how many markdown files and vector databases and prompt engineering hacks you implement; you’ll never change the fact that these models have limited context and frozen weights. reading a markdown file or querying a database is not “remembering”.


i don’t think people in this forum would have disagreed with this move in 2018, as much as sentiments have changed since. if you remove the political context and market moves from the equation, it is truly fascinating how these models work. GPT-2 was a crazy leap forward for language modeling, and the idea that a language model would be threatening middle-class jobs wasn’t even on the table at that point. the idea that a pile of floating point numbers could write a React app is incredible, if politically fraught.
also, it wasn’t clear back then what OpenAI would become. they were a non-profit, and as clear as our hindsight is today, this was before ChatGPT or any customer-facing products had come out of OpenAI.
i can’t be the only nerd in the room who has been fascinated by AI since childhood, only to face a reality where it’s not what i imagined it would be.


as much as i hate this garbage site, it was referenced in the article: https://x.com/kdaigle/status/2047803291988590609

truly wild to live in a world where giant corporations that control intelligent machines campaign to convince the general public that they haven’t created a consciousness and therefore don’t have to treat it as such. not saying they have invented a consciousness, but them getting out in front of it like this is pretty sci-fi.


come back to
is the real joke here. why would anyone come back? the reason this is such a joke is that GitHub has started failing not just in Actions or Copilot but at literally losing commits, i.e. the core git technology that had been rock solid since before there even was a GitHub. after migrating away over stuff like this, they’d literally have to pay me.
semantic search is a great use case. grab a good embedding model and set up Postgres with pgvector, and i can semantically search my Obsidian D&D notes
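a minimal sketch of what that query boils down to, with toy vectors standing in for a real embedding model (the note titles and numbers here are made up; pgvector’s `<=>` operator is cosine distance, which is all this reimplements):

```python
import math

# toy "embeddings" standing in for a real model's output (hypothetical values)
notes = {
    "beholder stat block": [0.9, 0.1, 0.0],
    "tavern rumor table":  [0.1, 0.8, 0.2],
    "session 12 recap":    [0.2, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # embedding of a search like "eye monster encounter"

def cosine_distance(a, b):
    # pgvector's <=> operator: 1 - cosine similarity
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1 - dot / norm

# the in-database equivalent would be something like:
#   SELECT title FROM notes ORDER BY embedding <=> %s LIMIT 1;
best = min(notes, key=lambda t: cosine_distance(notes[t], query))
print(best)  # -> beholder stat block
```

the nice part of doing it inside Postgres is that the index (ivfflat/hnsw) does this ranking for you instead of scanning every note.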


ROS is an embedded systems OS, right?


thanks for clarifying. it was hard for me to dignify such a comment with a response.
you’re also going to run into acceleration issues: Metal isn’t available under a Linux kernel. i don’t really see a need to containerize these workloads these days anyway, with tools like uv.
it’s a big pain in my ass at times trying to do web dev work with an aarch64-darwin dev env vs the target x86_64-linux. adding in hardware acceleration issues just sounds painful.
i also just personally don’t like containers. feels like a bludgeon of a solution.


oh i see. embedded systems makes sense. i wouldn’t even try to go beyond the factory recommendation for systems like that, except maybe for fun. there are likely kernel modifications or modules required for those systems.

i’m baffled GoDaddy still exists. i’ve never heard good things about them, but every normie i know mentions them first when the idea of buying a domain comes up


Linux libraries sometimes can’t even install on a newer kernel.
i’m curious where you run into this. i’ve never had this issue in 10 years of using Linux, most of that time on Arch with the latest kernel


in a container
well there’s your issue. i get not liking the OS, but actively crippling your project will cripple your project.
containers on macOS do kinda suck


just a silly turn of phrase meaning: you should know that this is what you signed up for


so, it’s the same.
saying “Linux does dynamic linking and Windows does static linking” is both false and a mischaracterization. Windows absolutely does dynamic linking via dynamic-link libraries (.dll). how dependencies are linked is up to the developer and whatever platform constraints apply. one reason i like Rust is that it prefers static linking, and a lot of toolchains are moving in that direction. the reason Linux distros push people toward their internal package management tools (eg apt) is to have tighter control over dynamic linking.
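a quick way to see that both OSes do dynamic linking is to load the C runtime at runtime from Python (ctypes is stdlib; "msvcrt" and libc are the standard runtime names on each platform):

```python
import ctypes
import ctypes.util
import sys

# load the C runtime as a shared library: a .dll on Windows, a .so on Linux.
# the mechanism is the same on both platforms -- symbols resolved at load time.
if sys.platform == "win32":
    libc = ctypes.CDLL("msvcrt")                       # Windows C runtime DLL
else:
    libc = ctypes.CDLL(ctypes.util.find_library("c"))  # glibc/musl shared object

print(libc.abs(-7))  # calls into the dynamically linked library -> 7
```

same script, same dynamic loading, either OS. the difference people actually feel is distribution policy, not linker capability.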
and we’re also glossing over scoop and chocolatey and winget and Docker.
but that’s where you get to stuff like Flatpak and snap and Nix that try to contain the dynamic dependencies.
i don’t think downloading exes hoping that Windows has stuffed enough DLLs into the OS and just running them is a better solution.


super fair. i am a Linux guy normally, i’m just being honest. i wish there were a better, more open alternative.
if you want to go with the Linux alternative it’s going to cost. get at least 32GB of RAM and at least a 4090 to run the kind of models you’re asking for. it’s the way she goes
there actually was a game like this waaaay back in the day. you’d stake claims to areas, and people could come by and challenge you. there were also monsters n stuff. it wasn’t super hi-fi, mostly 2D sprites on top of Google Maps, but it was pretty cool. i always thought it had potential. then Ingress came along and sucked all the air out of the space, with its developer Niantic eventually going on to make Pokémon Go.