
  • Hrm, the pre-commit issue is still open.

    Like the others in that thread, I'm not married to pre-commit or the check happening before the commit as opposed to the push, I just want to have some easy-to-setup, standardized way of preventing myself from pushing stuff that will be rejected by CI.
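
    In the meantime, a hand-rolled pre-push hook gets most of the way there. A minimal sketch — the specific cargo commands are assumptions about what CI checks, so substitute your own:

```shell
# Sketch: a pre-push hook that runs the same checks CI would reject on.
# The cargo commands are assumptions; replace them with your CI's checks.
mkdir -p .git/hooks   # already exists in a real clone
cat > .git/hooks/pre-push <<'EOF'
#!/bin/sh
set -e
cargo fmt --check            # fail the push on unformatted code
cargo clippy -- -D warnings  # fail the push on lint warnings
EOF
chmod +x .git/hooks/pre-push
```

    Not standardized, not shareable via the repo without extra setup — which is exactly the gap the pre-commit issue is about.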

  • It's ultimately good news, but the framing is bizarre.

    Who criticises global warming? Well, people like us, and Pope Leo. As opposed to people who'd rather criticise us and claim that global warming is no biggie (or even not happening).

    Similarly, who minimises climate change? Well, people who are actually doing something about it. People who are switching away from burning fossil fuels and taking other steps to minimise the impact of not only themselves, but others, by working in fields like renewable energy, transit, heat pumps, etc.

    Even the other framing of "minimising the impact of climate change" means working with adaptation strategies.

    I can only assume the framing is so weird because of choices the BBC made.

  • One more puzzle piece here is that du won't report on files that have been marked for deletion but are still held on to by some process. There's an lsof incantation to list those, but I can't recall it off the top of my head.

    It used to be part of sysadmin work, when df reported that you were running out of space, to track down the processes holding on to large files and restart them to make them let go of the file. But I haven't done that in ages. And if you restarted the host OS, that should have taken care of it.

    I assume you also know how to prune container resources.
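
    A reproducible sketch of the phenomenon on Linux — and the incantation, now that I look it up, is lsof's +L1 (open files with a link count below one, i.e. deleted):

```shell
# Demonstration (Linux): a process holding a deleted file open keeps its
# space allocated, so df still counts it while du can't find it.
tmp=$(mktemp)
tail -f "$tmp" >/dev/null 2>&1 &   # stand-in for some long-lived process
pid=$!
sleep 1                            # give tail a moment to open the file
rm "$tmp"                          # unlink: du no longer sees it
deleted_fds=$(ls -l "/proc/$pid/fd" | grep -c deleted)
echo "open-but-deleted fds held by $pid: $deleted_fds"
kill "$pid"

# The lsof spelling of the same query (root needed for other users'
# processes): list open files whose link count is below one.
# sudo lsof -nP +L1
```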

  • As always for us across the pond: Simply fascinating that his party isn't polling in the single digits, or even decimals.

  • uutils is still busy playing catch-up to GNU coreutils though, so it's unclear how much competition they provide in terms of features.

  • Nope! I tend mostly to use /org/repo and other subpages. The few times I find myself on / I'm just confused at how I wound up there and close the tab.

  • fwiw, if you do a cargo build you should be able to see the error messages in their correct context. If I replicate line 25 in a little test project and run cargo build, I get

     
        
    error: expected one of `.`, `;`, `?`, `else`, or an operator, found `{`
     --> src/main.rs:4:43
      |
    4 |     let guess: u32 = guess.trim().parse() {
      |                                           ^ expected one of `.`, `;`, `?`, `else`, or an operator
    
    error: could not compile `unacceptable-rs` (bin "unacceptable-rs") due to 1 previous error

    If I try this with a blank Helix config I don't get any of the text output from rust-analyzer at all, just the three dots indicating there's a problem there, so it's unlikely it's a bad design choice on Helix's part.

  • You're missing a match after the = and before guess… on line 25.

    The multiple statements on lines 37, 38, 39 after => also need to be enclosed in a {} block.

    Also, why is your error message all the way up on the top, far away from the error? Something seems misconfigured.
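
    A sketch of the corrected shape, with identifiers borrowed from the classic guessing-game example rather than your actual code:

```rust
// Sketch of the corrected shape: `match` goes between `=` and the
// expression being matched, and a multi-statement arm gets braces.
fn parse_guess(input: &str) -> Option<u32> {
    let guess: u32 = match input.trim().parse() {
        Ok(num) => num,
        Err(_) => {
            // several statements after `=>`, so they're wrapped in `{}`
            eprintln!("please type a number");
            return None;
        }
    };
    Some(guess)
}

fn main() {
    assert_eq!(parse_guess("  42\n"), Some(42));
    assert_eq!(parse_guess("nope"), None);
}
```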

  • It's even a tape archiving tool. Just pretty much nobody uses it in the original way any more.

    Very much one of those "if it ain't broke, don't replace it" tools.
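
    The name still fits the original invocation, too; a sketch contrasting the two uses (the tape device path is illustrative):

```shell
# tar's original job: stream an archive straight to a tape device.
# /dev/st0 is the conventional first SCSI tape device; illustrative only.
# tar -cvf /dev/st0 /home/backup

# Everyday use today: archive to a compressed file instead.
mkdir -p demo && echo hello > demo/file.txt
tar -czf demo.tar.gz demo
tar -tzf demo.tar.gz   # lists demo/ and demo/file.txt
```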

  • Yeah, there should be a clear separation between scripts, which should have a shebang, and interactive use.

    If a script starts acting oddly after someone does a chsh, then that script is broken. Hopefully people don't actually distribute broken script files that have some implicit dependency on an unspecified interpreter in this day and age.
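
    i.e. something like this minimal sketch, where the shebang pins the interpreter regardless of the caller's login shell (the file name is illustrative):

```shell
# A script that declares its interpreter runs identically no matter
# which login shell invokes it.
cat > hello.sh <<'EOF'
#!/bin/bash
set -euo pipefail
echo "hello from bash"
EOF
chmod +x hello.sh
./hello.sh   # prints: hello from bash
```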

  • That’s interesting, I hadn’t thought about the JSON angle! Do you mean that you can actually use jq on regular command outputs like ls -l?

    No, you need to be using a tool that offers JSON output as an option. These are becoming more common, but I think they're still rare among the GNU coreutils. ls output especially is unparseable, as in, there are tons of resources telling people not to parse it because doing so is pretty much guaranteed to break.
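
    To illustrate, here's jq operating on a literal JSON document standing in for some tool's JSON output (lsblk --json and ip -json are real examples of tools that offer it):

```shell
# jq selecting structured fields; the literal JSON here is a stand-in
# for the --json output of a modern tool (e.g. lsblk --json).
echo '{"files":[{"name":"a.txt","size":120},{"name":"b.txt","size":30}]}' \
  | jq -r '.files[] | select(.size > 100) | .name'
# prints: a.txt
```

    No whitespace-splitting, no guessing at column positions — the query keeps working even if the producer adds fields.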

  • I've been using fish (with starship for prompt) for like a year I think, after having had a self-built zsh setup for … I don't know how long.

    I'm capable of using awk but in a very simple way; I generally prefer being able to use jq. IMO both awk and perl are sort of remnants of the age before JSON became the standard text-based structured data format. We used to have to write a lot of dinky little regex-based parsers in Perl to extract data. These days we likely get JSON and can operate on actual data structures.

    I tried nu very briefly but I'm just too used to POSIX-ish shells to bother switching to another model. For scripting I'll use #!/bin/bash with set -eou pipefail but very quickly switch to Python if it looks like it's going to have any sort of serious logic.

    My impression is that there are likely more of us who'd like a less wibbly-wobbly, better shell language for scripting purposes, but that efforts to design such a language very quickly go in the direction of nu and oil and whatnot.

  • People with aphantasia: You have no power here!

  • Depends on what's actually in the trade deal.

    But yeah, hopefully we can tax the yank tanks out of Europe if we can't ban them outright.

  • Isn’t that just nitpicking?

    No, because the definitions are phrased very differently. Software doesn't have to be copyleft to be considered FOSS either, as is the case with tons of BSD- and MIT-licensed code that's used in proprietary programs; all they have to do is make it clear that they're using that software (and even that's not a given).

    Even with copyleft licenses like the GPL, as long as they never distribute their software to anyone they don't have to offer them the source code either, as with so many backends. The AGPL extends those rights to users who interact with the software over a network.

    Free software is mostly about providing you rights when you encounter the source code, meaning that you're allowed to modify it and share it. This is as opposed to stuff like "source available" licenses that permit you to read the source code, but not modify or share it.

  • Such a license would be regarded as neither free software nor open source.

    Some other alternative could be making GPL-3.0-or-later + a Contributor License Agreement a more common option, so that it is possible to tell companies that if they want to use the library in some closed-source application, they need to work out a license deal.

    CLAs are frequently involved in turning software proprietary though, so they aren't exactly held in the highest esteem in the FOSS community.

    And without a CLA you essentially get the Linux kernel situation, which will be stuck on GPL2 forever, since they can't reasonably get everyone to agree to switch to GPL3, especially since some copyright holders are not just unwilling, but unreachable or dead (and in several jurisdictions copyright lasts for decades after death).

    Personally I suspect public funding, similar to science, education and libraries, is a more likely option, though that'll be an uphill political struggle in a lot of places.

  • Yep. I wonder if that CRA compliance stuff won't change that. Industries with strict demands on safety should be putting in work and resources to ensure that those demands are actually met, but how the CRA deals with FOSS took a bit of work to not be a complete disaster, and I can't imagine it's easy for FOSS projects to work out the details there.

    As in:

    1. The automotive industry absolutely should be CRA compliant,
    2. it'd be nice for everyone if cURL was known to be CRA compliant,
    3. compliance doesn't appear by magic, someone has to put in work,
    4. companies that should be CRA compliant should help with that work.

    In the case where they don't want to pitch in, well, something cURL-equivalent but known CRA-compliant won't just fall off the back of a wagon, which means the companies that need compliance have a problem.

    Then again, apparently the HPE Nonstop ecosystem has git available on their platform all through the spare-time efforts of all of one dude, which absolutely shows that critical systems are willing to rely on precarious software, so I'm not gonna hold my breath.

  • Yeah, it's once again a case of a central piece of software in a very precarious situation, and businesses that aren't … quite mindful of the fact that they're making demands from someone they're not paying.

  • Possibly, but the article didn't seem to specify what the actual complaints were. Hopefully they've worked with some actual Apache people to figure out a rebrand that's decent towards the Apache (and addresses the complaints), but also doesn't make the brand more different than it needs to be.