There are various metrics, like CRI, CQS, or newer variants of them.
They basically measure how close the light's wavelength spectrum is to that of a black-body radiator at a given color temperature (e.g. an incandescent lamp or the sun); there's a rough sketch of the idea below.
Though IME, the light quality of a real white LED is better than the mix of an RGB LED. Also interesting: the cooler the LED is, the higher the quality of the light.
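To make that concrete, here's a rough sketch (my own illustration, not the actual CRI/CQS math, which evaluates standardized color samples rendered under both light sources): compare a measured spectrum point by point against a Planck black-body spectrum at the same color temperature.

```rust
// Planck's-law reference spectrum and a naive point-by-point comparison.
// This only illustrates the "closeness to a black body" idea; real CRI/CQS
// calculations work on rendered color samples instead.

const H: f64 = 6.626_070_15e-34; // Planck constant (J*s)
const C: f64 = 2.997_924_58e8;   // speed of light (m/s)
const KB: f64 = 1.380_649e-23;   // Boltzmann constant (J/K)

/// Spectral radiance of a black body at `wavelength` (m) and `temp` (K).
fn planck(wavelength: f64, temp: f64) -> f64 {
    let a = 2.0 * H * C * C / wavelength.powi(5);
    a / ((H * C / (wavelength * KB * temp)).exp() - 1.0)
}

/// Normalize a spectrum to its peak so only the shapes get compared.
fn normalize(v: &[f64]) -> Vec<f64> {
    let max = v.iter().cloned().fold(f64::MIN, f64::max);
    v.iter().map(|x| x / max).collect()
}

/// Naive similarity score in [0, 1]: 1.0 means the measured spectrum has
/// exactly the shape of a black body at the given color temperature.
fn spectral_match(measured: &[(f64, f64)], temp: f64) -> f64 {
    let m = normalize(&measured.iter().map(|&(_, p)| p).collect::<Vec<_>>());
    let r = normalize(&measured.iter().map(|&(wl, _)| planck(wl, temp)).collect::<Vec<_>>());
    1.0 - m.iter().zip(&r).map(|(a, b)| (a - b).abs()).sum::<f64>() / m.len() as f64
}

fn main() {
    // Hypothetical warm-white LED spectrum: (wavelength in m, relative power).
    let led = [(450e-9, 0.9), (500e-9, 0.4), (550e-9, 0.6), (600e-9, 0.7), (650e-9, 0.5)];
    println!("shape match vs. 2700 K black body: {:.3}", spectral_match(&led, 2700.0));
}
```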
No, it's not entirely clearly defined. But a few elements appear in all definitions: anti-communism/socialism, a dictator, far-right ideologies, nationalism...
I wouldn't say any of these are desired by leftism; rather the opposite.
Honest opinion: I don't think they would ever be the bad guys; only a dead or non-existent Nazi is a good Nazi.
But I guess the unfortunate thing is that ICE would band together and an actual civil war would start (in which ICE likely has the upper hand, as it's funded by the government, and Nazis are more likely to have firearms, I guess), which is probably exactly what the Trump administration wants. I don't know, it's a pretty bad situation over there...
I'm using a Kaffeelogic Nano 7 sample roaster, which is quite simple to use and produces consistent results. I actually think almost all of my roasts have been at least as good as the high-quality roasts I get locally.
That’s an absolute shame, because there’s tons and tons of cool coffee shops absolutely all over the place doing really cool, interesting, imaginative, and downright tasty things with coffee that you’re missing out on.
Maybe not around here (it's not the biggest city, though); I think I've tasted every worthwhile coffee in the city so far. Some are OK, but nothing that really stands out.
It's also meant more figuratively (though there's still some truth to it... after getting habituated to good coffee, previously OK coffee is now bad... so I've gotten really picky over the course of my coffee-nerd career).
Starbucks coffee isn’t really intended to be enjoyed straight, it’s supposed to be made into milk drinks where the dairy, syrups, and toppings provide most of the flavor, and for that use case, it’s adequate.
Yeah, it's the American perversion of coffee. It's more like a soft drink with coffee flavor or something like that...
Yeah they roast way too dark, probably to hide the cheap coffee they use and possibly because their extraction is shit.
I can't drink coffee anywhere else anymore since I started roasting myself and perfected extraction with a Cafelat Robot (low pressure, which I think works better with lighter roasts).
because the massive ecosystem of JS components makes you more productive.
Slightly less ironically: I question even this right now (as I have to suffer through endless "hot" reloading and browser crashes because of Next.js bloat).
I think the massive ecosystem has fewer high quality libraries than Rust at this point.
I use both JS/TS (frontend) and Rust (frontend more as a hobby, plus backend) extensively, and I very often check the source of dependencies, and even more often rewrite them (unfortunately not in Rust) because of low quality.
And it's sooo slow... the tooling and the frontend (although I think that has a lot to do with Next.js... and with how easy it is to make things slow for someone not that experienced or not extremely careful).
Rust on the frontend is not yet as mature as JS/TS (whatever "mature" means, but the number of frontend frameworks is at least an order of magnitude higher in JS/TS), but I think if I started a new company I would now default to Rust for the frontend too; the language itself is reason enough for me. And I think vanilla JS (or Rust?) is not that much worse (time/effort-wise, sanity, etc.) for more complex applications than what the Next.js ecosystem has produced so far.
Actually, my (not that small) Rust projects now officially take less time to cold-compile than the "hot" reloading of our Next.js monster at my job. Incremental compilation is at least an order of magnitude faster. And as the cherry on top, dumb Rust code is often 100x faster than JS.
You can't imagine how often I swore about JS just today. What went through the minds of its designers when they created this growing disease, and why did web browsers accept it as the lingua franca of the web? So... much... pain...
Definitely not your average Rust code; more like a very ugly example of it.
Also, the syntax put me off at first as well, but I gave it a chance years later and have now (or rather, years ago) officially joined the church of Rust evangelism.
A lot of the syntax you call ugly makes sense once you learn it; it's just so much more explicit than a more dynamic language, and exactly that saves your ass a lot (it did for me at the very least). (I don't mean macros; macros are ugly and should be avoided if possible.)
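A toy sketch of the kind of explicitness I mean (my own made-up example, not from any real codebase): the possibility of a missing or malformed value is spelled out in the types, so the compiler forces you to decide what happens in those cases.

```rust
use std::collections::HashMap;

// A fallible lookup has to say so in its signature: Option<u16> instead of
// "maybe undefined, maybe throws" as in a more dynamic language.
fn find_port(config: &HashMap<String, String>) -> Option<u16> {
    config.get("port")?.parse().ok()
}

fn main() {
    let mut config = HashMap::new();
    config.insert("port".to_string(), "8080".to_string());

    // The compiler won't let you use the value without handling the
    // missing/malformed case explicitly.
    match find_port(&config) {
        Some(port) => println!("listening on {port}"),
        None => eprintln!("no valid port configured"),
    }
}
```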
Yeah, it takes more time than a quick-and-dirty Python script. But when I count the countless hours (oh, the irony) lost to mindless leaky abstractions and the resulting debugging into this equation, I'm certain that I'm at least not much slower writing it.
As I said, I'm not talking about the last 10-20% of possible performance, or even up to 40%, but more like an order of magnitude (at least), i.e. algorithmically insufficient code, or relying too much on your abstractions doing everything right and on using them correctly (which, in the case of React, is seemingly not happening, looking at the modern web).
Taking that example (Rust) again, I very often get away with .clone() everywhere, i.e. not even caring much about performance, and performance is still not significantly impacted.
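Roughly what I mean by .clone() everywhere (a made-up toy example): clone small owned data instead of fighting the borrow checker, at a cost that rarely shows up in practice.

```rust
#[derive(Clone, Debug)]
struct User {
    name: String,
    tags: Vec<String>,
}

fn uppercase_tags(user: &User) -> Vec<String> {
    // Cloning copies a handful of short strings; it keeps ownership simple
    // and the borrow checker quiet, without measurably affecting typical code.
    user.tags.clone().into_iter().map(|t| t.to_uppercase()).collect()
}

fn main() {
    let user = User {
        name: "example".to_string(),
        tags: vec!["rust".to_string(), "coffee".to_string()],
    };
    println!("{:?}", uppercase_tags(&user));
    println!("{:?}", user); // the original is still usable afterwards
}
```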
Then I switch to the TypeScript code-base at my job and get aggravated by the extreme slowness (because of stupid abstraction-inceptions, like, wtf? shadcn needs to be built on radix-ui, which needs to be built on React, etc., which in effect results in a slow abstraction hell... and leaky abstractions everywhere).
Non-ironically: in practice it mostly boils down to experience; writing relatively efficient software should not take much more time, and may even accelerate development long-term (less time spent waiting). (I'm not talking about the last few percent from reverse-engineering the compiler's SIMD optimisations; that does take time...)
I detest the state modern web development has spiraled down into.
I bet I'd be faster writing a big application in vanilla JS than with the abomination that Next.js has become...
The problem, though (with AI compared to humans): a human team learns, i.e. at some point they probably know what the mistake was and avoid making it again.
With AI instead of humans: well, maybe the next model, or a different one, will fix it... maybe...
And what is very clear to me after trying to use these models: the larger the code-base, the worse the AI gets, to the point of not helping at all or even being destructive.
Apart from dissecting out small, isolatable pieces of independent code (i.e. keeping the context small for the AI).
Humans likely get slower with a larger code-base, but they (usually) don't arrive at a point where they can't progress any further.
Yes, and blending between them, including RGB, can also enhance the quality of the light, to better approximate a natural light source.