One subtle divergence from this: I think the joke here is not that Everett opposes homelessness (and is generous regardless); the joke is that he wants to encourage this fellow, and is actively fighting those who would discourage him.
It's just a little different perspective on why this is funny.
I'm pretty anti-AI but even I'll cop to this one. ChatGPT is good at figuring out what you're trying to describe. Need a particular networking concept but can't remember its name? Describe it a bit to ChatGPT and ask for some similar concepts, and the thing you're looking for will probably be in the list.
Looking for a particular library that you assume must exist even though you've never seen it? ChatGPT can give you that.
You're on your own after that, but it can actually save you a bit of research time.
The problem is this: it's sure it has the answer 100% of the time, but maybe 30% of the time the list is nothing but wrong answers, and you can go off in the wrong direction as a result.
Intent matters in criminal law and would be considered in future cases of this type. If someone is being arrested for violating the law, and the intent of the arrest is to prosecute legitimate criminal behavior, you're good. If it can be shown that the intent was political retaliation, you're in the shit.
It's actually safer if everyone knows. Spreading the knowledge of Roko's basilisk to everyone means that everyone is incentivized to contribute to the basilisk's advancement. Therefore just talking about it is also contributing.
I'd be willing to bet this genius maneuver drives it back up.
Yeah, looking more closely at that graph, I'm noticing it starts in 2009, when Greece had The Crisis: sovereign debt soared thanks to the housing bubble collapse and to people taking a closer look at the actual books of the Greek state. Austerity measures are what led to the massive unemployment spike, and this 6-day work week is another version of austerity.
Austerity doesn't work. This graph couldn't be clearer about that fact.
When we say LLMs don't know or understand anything, this is what we mean. This is a perfect example of an "AI" just not having any idea what it's doing.
I'll start with a bit of praise: It does do a fairly good job of decomposing the elements of Python and the actuary profession into bits that would be representative of those realms.
But:
In the text version of the response, there are already far too many elements for a good tattoo, demonstrating that it doesn't understand tattoo design, or even design in general.
In the drawn version, the design uses big blocks of color with no detail, which (even if they looked good on a white background, and they don't) would look like shit inked on someone's skin. So again, no understanding of tattoo art.
It produces a "simplified version" of the Python logo. I assume those elements are the blue and yellow hexagons, which are at least the correct colors. But it doesn't understand that, for these to be PART OF THE SAME DESIGN, they must be visually connected, not just near each other. It also doesn't understand that the logo's shape is more like a plus; nor that it's composed of two snakes; nor that the Python logo is ALREADY VERY SIMPLE; nor that the logo, lacking snakes, loses any meaning in its role of representing Python.
It says there's a briefcase and glasses in there. Maybe the brown rectangle? Or is the gray rectangle meant to be a briefcase lying on its side so the handle is visible? No understanding here of how humans process visual information, or what makes a visual representation recognizable to a human brain.
Math stuff can be very visually interesting. Lots of mathematical constructs have compelling visuals that go with them. A competent designer could even tie them into the Python stuff in a unified way; like, imagine a bar graph where the bars were snakes, twining around each other in a double helix. You got math, you got Python, you got data analysis. Here, none of it ties together, or is even made to look good on its own. No understanding of what makes something interesting.
Everything is just randomly scattered. Once again, no understanding of what design is.
AIs do not understand anything. They just regurgitate in ways that the algorithm chooses. There's no attempt to make the algorithm right, or smart, or relevant, or anything except an algorithm that's just mashing up strings and vectors.
Why do I see so many of these accounts from reddthat?