I can tell from some of the pixels and from seeing quite a few shops in my time.
Us Professors of History?
When I was a little girl I thought that everything, all the abuse and neglect, it somehow made me… special. And I decided that one day I would write something that would make little girls like me feel less alone. And if I can’t write that book…
…if I don’t, that means that all the damage I got isn’t good damage, it’s just damage. I have gotten nothing out of it, and all those years I was miserable were for nothing. I could’ve been happy this whole time and written books about girl detectives and been cheerful and popular and had good parents, is that what you’re saying? What was it all for?

- Diane Nguyen, BoJack Horseman, S06E10, “Good Damage”
Yeah, but they encourage confining it to a virtual machine with limited access.
Logic and Path-finding?
I’m getting this shit from everywhere I’ve ever lived. I can normally ignore my phone for the most part, but I’m actually waiting on updates about the health of a relative. And this shit needs to stop. No one’s changing their mind cause of a text, guys. No one’s like, “Oh yeah, is it time to vote?” Please, just stop.
AR needs to kill the smartphone screen soon. Make my phone a keyboard and an external processor for my wearables.
So many? What kinda numbers are we talking here?
Shithole country.
I instantly heard it. DEEK
Yeah, using image recognition on a screenshot of the desktop and directing a mouse around the screen with coordinates is definitely an intermediate implementation. Open Interpreter, Shell-GPT, LLM-Shell, and DemandGen make a little more sense to me for anything that can currently be done from a CLI, but I’ve never actually tested em.
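The screenshot-and-coordinates loop is roughly: capture the screen, send the image to the model, get back an action with pixel coordinates, sanity-check it, then execute it with an automation library. A minimal sketch of the sanity-check step, assuming a hypothetical action format (the field names and screen size here are illustrative, not any real tool's API):

```python
# Hypothetical action-validation step for a screenshot -> model -> mouse loop.
# The action dict shape ({"action": ..., "coordinate": [x, y]}) and the screen
# resolution are assumptions for illustration only.

SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution

def sanitize_action(action: dict) -> dict:
    """Reject unknown verbs and clamp model-proposed coordinates to the screen."""
    allowed = {"click", "double_click", "move", "type"}
    verb = action.get("action")
    if verb not in allowed:
        raise ValueError(f"unsupported action: {verb!r}")
    out = {"action": verb}
    if "coordinate" in action:
        x, y = action["coordinate"]
        # Models occasionally hallucinate off-screen coordinates; clamp them
        # rather than letting the mouse driver error out.
        out["coordinate"] = (min(max(int(x), 0), SCREEN_W - 1),
                             min(max(int(y), 0), SCREEN_H - 1))
    if verb == "type":
        out["text"] = str(action.get("text", ""))
    return out
```

The sanitized action would then be handed to whatever does the actual clicking (e.g. a library like PyAutoGUI); keeping validation separate makes it easy to log or veto actions before they touch the desktop.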
I was watching users test this out and am generally impressed. At one point, Claude tried to open Firefox, but it was not responding, so it killed the process from the console and restarted it. A small thing, but not something I would have expected it to overcome this early. It’s clearly not ready for prime time (per their repeated warnings), but I’m happy to see these capabilities finally making it to a foundation model’s API. It’ll be interesting to see how much remains of GUIs (or high-level programming languages, for that matter) if/when AI can reliably translate natural language to hardware behavior.
Can I blame Trump on 9/11 or something?
Just for fun, this can be accomplished with a poorly shielded speaker/audio cable next to a poorly shielded CRT/monitor cable displaying a locally run LLM. And it will make you feel like a hax0r