If you are writing a parser in haskell just use Happy and get it over with
What a coincidence, I saw that for the first time this Friday
Apathetic omelette
Permanently Deleted
I experience something similar on a Vega 56, but it doesn't happen under generic high load, it happens only
- under some specific loads (for me it was mainly A Plague Tale: Requiem)
- when something touches the grounding of my monitor (the DisplayPort output of the GPU is faulty, it seems)
I like many of your points, but your comment is facetious.
You said it yourself, "it's good for someone trying to bang out scripts"... and that's it, that's the main point, that's the purpose of python. I will argue until my dying breath that python is a trillion times better than sh/bash/zsh/fish/bat/powershell/whatever for writing scripts, in every aspect except availability; and if availability is a concern, the only real options are the old Unix shells and bat (even with powershell you never know whether you're stuck on PS 5 or can use PS 7).
I have a python script running 24/7 on a raspberry that listens on some mqtt topics and reacts accordingly, asynchronously. It uses about 15 kiB (literally less than 4 pages) of ram, mostly for the interpreter, and it's plenty responsive; it uses about two minutes of CPU time a day. I could have written it in rust or go, I know enough of both to do it, and it would have been faster and more efficient, but it would have taken three times as long to write and it would have been a bitch to modify; in C it would have been even worse. For that little extra efficiency it makes no sense.
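The shape of that kind of script, minus the broker connection (the real thing would use an MQTT client library like paho-mqtt or aiomqtt; the topic names and handlers here are invented for illustration), is roughly:

```python
import asyncio

# Hypothetical topic -> coroutine dispatch table. The actual MQTT
# connection is omitted so the sketch stays self-contained.
async def handle_light(payload: str) -> str:
    # React to a light command
    return f"light set to {payload}"

async def handle_sensor(payload: str) -> str:
    # Record a sensor reading
    return f"logged {payload}"

HANDLERS = {
    "home/light": handle_light,
    "home/sensor": handle_sensor,
}

async def dispatch(topic: str, payload: str) -> str:
    # Look up the handler for this topic and await it;
    # unknown topics are silently ignored.
    handler = HANDLERS.get(topic)
    if handler is None:
        return "ignored"
    return await handler(payload)
```

The client library's message callback would just feed each incoming (topic, payload) pair into `dispatch`.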
You argue it has no place in mainstream software, but that's not really a matter of python, more a matter of bad software engineers. Ok, cool that you recognise the issue, but I'd rather you went after the million people shipping a full browser in every GUI application than after the guys wasting 10 kiB of your ram to run python. And even in that case, it's not an issue with JavaScript, but an issue with bad practices.
P.S. "does one thing well" is a smokescreen to hide doing less stuff; you shouldn't base your whole design philosophy on a quote from the 70s. That is the kind of shit SystemD haters shout while running a display server that also manages input, opengl, a widget toolkit, remote desktop, and the entire printer stack. The more a high-profile tool does, the less your janky glue-code scripts need to do.
I'll be honest, I think modern python is cool. You just need to accept that it has some limitations by design, but they mostly make sense for its purpose.
It's true that the type system is optional, but it gets more and more expressive with every version, it's honestly quite cool. I wish Pylance were a bit smarter though, it sometimes fails to infer sum types in if-else statements.
After a couple of large-ish personal projects I have concluded that the problem with python isn't the language, but the users.
On the other hand, C's design is barren. Sure, it works, it does the thing, it gives you very low-level control. But there is nothing of note in the design, aside from some quirks of the specification. Being devoid of innovation is both its strength and its weakness.
What a coincidence, I just read that watchmen chapter yesterday
In this context "weight" is a mathematical term. Have you ever heard the term "weighted average"? Basically it means calculating an average where some elements are more "influential/important" than others; the number that indicates the importance of an element is called a weight.
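A two-number example (the values and weights are made up for illustration):

```python
values  = [2.0, 4.0]
weights = [0.25, 0.75]  # the weights sum to 1, so this is a weighted average
wavg = sum(w * v for w, v in zip(weights, values))
# 0.25 * 2.0 + 0.75 * 4.0 = 3.5 -- the second value "weighs" more,
# so the result sits closer to 4 than the plain average (3.0) would
```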
One oversimplification of how any neural network works could be this:
- The NN receives some values as input
- The NN calculates many weighted averages from those values. Each average uses a different list of weights.
- The NN does a simple special operation on each average. It's not important what the operation actually is, but it must be there, and it must be non-linear: anything except sums and multiplications works. Without this, every NN would collapse into a single layer.
- The modified averages are the input values for the next layer.
- Each layer has different lists of weights.
- In reality this is all done using some mathematical and computational tricks, but the basic idea is the same.
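The steps above, boiled down to a sketch (the layer sizes and weights are invented; the "special operation" here is ReLU, one common choice):

```python
def relu(x: float) -> float:
    # The "simple special operation": any non-linearity works
    return max(0.0, x)

def layer(inputs, weight_lists):
    # One weighted sum per list of weights, then the special operation
    return [relu(sum(w * x for w, x in zip(ws, inputs)))
            for ws in weight_lists]

def network(inputs, layers):
    # The outputs of each layer become the inputs of the next
    for weight_lists in layers:
        inputs = layer(inputs, weight_lists)
    return inputs

# Toy 2-input network with two layers (weights made up)
layers = [
    [[0.5, -1.0], [1.0, 1.0]],  # layer 1: two weighted sums
    [[1.0, 0.5]],               # layer 2: one weighted sum
]
out = network([1.0, 2.0], layers)
```

Real implementations do the same thing as matrix multiplications (plus a bias term per sum), but the structure is exactly this.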
Training an AI means finding the weights that give the best results, and thus, for an AI to be open-source, we need both the weights and the training code that generated them.
Personally, I feel that we should also have the original training data itself to call it open source, not just weights and code.
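"Finding the weights" above usually means gradient descent: nudge each weight in the direction that reduces the error, many times over. A one-weight toy version (the target, input, and learning rate are made up):

```python
# Fit a single weight w so that w * 3.0 approximates the target 6.0,
# by repeatedly stepping against the gradient of the squared error.
w = 0.0
lr = 0.05
for _ in range(200):
    pred = w * 3.0
    err = pred - 6.0
    grad = 2.0 * err * 3.0  # d/dw of (w*3 - 6)^2
    w -= lr * grad
# w converges toward 2.0
```

Training a real model is this loop scaled up to billions of weights, which is why the training data and code matter as much as the final weights.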
I would like to interject for a moment. This statement is technically true but disingenuous and facetious.
While it's true that Linux is just the kernel, what most people refer to as Linux is actually the Operating System GNU/Linux, or, as RMS would now call it, GNU plus Linux, or sometimes, a less GNU-dependent, but mostly GNU/Linux-compatible OS, or, as I have literally just now come to call it, */Linux.
Moreover, a modern */Linux system is expected to be based on SystemD, unless explicitly avoiding it due to some technical constraint or some desired feature of another init system. One could come to call this SystemD/Linux.
And lastly, this kind of use case would be the perfect match for a Wayland shell, as opposed to an X11 shell. Which would be more efficient, and would give the shell more freedom in the management of windows.
As a result, when asking about a Linux phone, we could expect one is talking about a phone running a SystemD+Wayland/Linux OS, or at least a mobile-focused */Linux OS.
The Android kernel is a (largely downstream) fork of the Linux kernel, but the Android OS is in almost no way compatible with any */Linux OS, and it's instead its own completely different OS.
Maybe we can get Disney to copyright this company into oblivion 🤔
What the fuck is this bullshit? This is literally out of a Scrooge McDuck story. Not even a joke, I literally have it on paper
Edit: here's a page
What episode is that?
"Next time" might mean December, so don't let your guard down, for now. But it really was a great success
Dunno about you, but my opinion has recently helped to successfully repel mass surveillance for my country and others. The fight is not over yet, but this battle was won.
The server in question is a raspberry with 4 gigabytes of ram, so I will need to use containers very sparingly. Basically I'm using podman quadlets only for those services that really only come in containers (which for now means only codimd, overleaf, and zigbee2mqtt), and I'm running everything else on bare metal. But even with containers, I would still need to manage container configurations, networking, firewall, file-sharing permissions, etc., just like I did without containers.
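For reference, a quadlet is just a systemd-unit-style file that podman turns into a service; a minimal sketch for one of those services (the image tag, paths, and names here are illustrative, check the zigbee2mqtt docs for the real ones):

```ini
# ~/.config/containers/systemd/zigbee2mqtt.container
[Unit]
Description=Zigbee2MQTT bridge

[Container]
Image=docker.io/koenkk/zigbee2mqtt:latest
Volume=%h/zigbee2mqtt/data:/app/data:Z
Network=host

[Install]
WantedBy=default.target
```

After a `systemctl --user daemon-reload`, it behaves like any other user service.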
Nope, Alan Wake is voiced by Matthew Porretta (Darling) but acted by Ilkka Villi. Just like Alex Casey and Max Payne are acted by Sam Lake (the creative director of Remedy) but voiced by James McCaffrey (Director Trench). I think this was because both Lake and Villi have really strong accents
There are two DLCs, Foundation and AWE, did you get both? I think AWE is good, but Foundation is really great
First of all... Always follow the distribution's instructions for this kind of thing. Second, I suppose you installed the proprietary drivers, but your GPU is a 30xx so you should really get the open drivers instead. There's a paragraph on that in the page you linked, try that and see if it works.