
Posts: 2 · Comments: 378 · Joined: 2 yr. ago

  • We've found it to be the "least bad option" for DnD. Have a Discord window open for everyone to video chat in, have a browser window open with Owlbear Rodeo or Foundry / Forge for your tokens and character sheets, all works smoothly enough. The text chat is sufficient for sending the DM a private message; for group chat to share art of the things you've just run into or organise the next session.

    Completely agree that for anything "less transient", then the UX is beyond awful and trying to find anything historical is a massive PITA.

  • Yeah, I'm with you there - worked for twenty years in water treatment myself. Water before it's been chlorinated / chloraminated for supply? Makes the best cups of tea and coffee ever - you need to boil it, of course. RO water? Vile.

  • The joke about adding well water back in again at the end is "correct". Reverse osmosis removes essentially all of the dissolved solids from the water, but drinking water usually contains small quantities of them - you can see a breakdown on the label of some bottled waters. Completely pure water would leach out all of the solids that have built up on the insides of water pipes over the decades, and would strip the protective oxide layer from metal pipework, causing it to corrode surprisingly rapidly. It also tastes pretty shitty - kind of "dead". So a small amount of high-solids water is mixed back in after RO to bring the water back to normal levels.

    All that other shit in the diagram? No. Purification and treatment takes place after the mixing step, it would be crazy not to.
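    For a sense of the numbers involved in that blending step, here's a back-of-the-envelope sketch - the TDS figures are made up purely for illustration:

    ```python
    # Rough blending calculation: what fraction of raw (high-solids) water to
    # mix back into RO permeate to hit a target total dissolved solids (TDS).
    # All the figures below are illustrative, not from any real plant.

    def blend_fraction(raw_tds, permeate_tds, target_tds):
        """Fraction of raw water in the final blend, by volume."""
        return (target_tds - permeate_tds) / (raw_tds - permeate_tds)

    raw_tds = 800.0      # mg/L in the high-solids well water (assumed)
    permeate_tds = 5.0   # mg/L after RO - close to zero, never quite
    target_tds = 150.0   # mg/L target for palatable, non-corrosive water (assumed)

    f = blend_fraction(raw_tds, permeate_tds, target_tds)
    print(f"Blend back {f:.0%} raw water, {1 - f:.0%} permeate")
    # -> about 18% raw water, for these made-up figures
    ```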

  • Removed

    Emacs.ch (Mastodon Instance for the Emacs community) will shut down.

  • Should have used Vim instead, that's a real text editor. No-one who starts using it ever moves on to something else.

  • Dark Souls' implementation is something special. It censors your name based on the language settings you have in place at the time, while the voice-over dialogue remains in English. So either change your system language to another language you know, or play it a few times so that you know what everything says, and then put in the most offensive shit you like as your character name.

  • Rule

  • It's been a perpetual source of surprise to me that curry houses are so 'non-specific'. Pakistan and India together have about 1.7 billion people, roughly a fifth of the planet's population, and I'd have thought an easy way for a restaurant to distinguish itself would be to offer something more region-specific, but it's fairly rare.

    Here in the UK, the majority of curry houses are Bangladeshi - it used to be the vast majority, now it's more like two-thirds. We've a couple of 'more specific' chains - both Bundobust and Dishoom do Mumbai-style, and they're both fantastic - and there are a few places that do well with the 'naturally vegan' cuisines, but mostly you can go into a restaurant and expect the usual suspects to be on the menu.

    Same goes for Chinese restaurants - I don't believe that a billion people all eat the same food, it's too big a place for the same ingredients to be in season all the time. Why are they not more specific, more often?

  • Indeed. Here in the UK, people can request that their water company add fluoride if their supply is naturally low in it - water from a reservoir, for instance - and the water company must do so.

    Back when I used to work in water, that was always the stuff that gave me nightmares. Concentrated hexafluorosilicic acid is what we'd use for dosing. We'd test all the equipment in the chemical room on plain water, drain it out and then literally brick up the doorway. The site would be evacuated during delivery - the delivery guy would connect everything up in a space suit and hop in the shower afterwards. A delivery lasted for ages and ages, since you only need the tiniest drip going into the water supply to get what you need - but the tiniest drip on your skin would be enough to kill you as well; its lethal dose is horrifically small.

    It made working with all the other halogens much less of a concern - we used shedloads of chlorine, but that stuff is much, much less nasty in comparison.
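    For a sense of just how tiny that drip is, a back-of-the-envelope dose calculation - the flow, acid strength and density below are assumed, roughly typical figures, not from any real site:

    ```python
    # Back-of-the-envelope fluoride dosing rate. Every figure here is an
    # assumption for illustration: a mid-sized works dosing ~20% w/w
    # hexafluorosilicic acid (H2SiF6) to a 1 mg/L fluoride target.

    flow_m3_per_h = 2000.0        # treated water flow (assumed)
    target_f_mg_per_l = 1.0       # typical fluoridation target

    acid_strength = 0.20          # w/w H2SiF6 in the delivered product (assumed)
    acid_density_kg_per_l = 1.2   # approximate product density (assumed)
    f_mass_fraction = 6 * 19.0 / 144.1   # ~79% of H2SiF6's mass is fluorine

    # fluoride needed per hour, in kg
    f_needed_kg_per_h = flow_m3_per_h * 1000 * target_f_mg_per_l / 1e6

    # fluoride delivered per litre of acid product, in kg
    f_per_litre_acid = acid_density_kg_per_l * acid_strength * f_mass_fraction

    dose_rate = f_needed_kg_per_h / f_per_litre_acid
    print(f"About {dose_rate:.0f} L/h of product to treat {flow_m3_per_h:.0f} m3/h")
    # -> roughly 11 L/h for 2000 m3/h: a trickle into a small river of water
    ```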

  • Yeah, it's always had really strong art direction - still holds up, and you don't notice missing shadows so much in the middle of a frenetic sequence anyway.

    Good to see ray tracing coming along. You could get the same shadows and lighting shown in the RTX version out of a modern rasterising engine, but at the cost of much more development time. Having graphics like that within reach of smaller studios, and larger games being feasible for bigger ones, would be great. HL2 is massive compared to modern shooters, and not having to spend forever tweaking each scene helps with that.

  • When I was still dual-booting Windows and Linux, I found that "raw disk" mode virtual machines worked wonders. I used VirtualBox, so you'd want a guide somewhat like this: https://superuser.com/questions/495025/use-physical-harddisk-in-virtual-box - other VM solutions are available, which don't require you to accept an agreement with Oracle.

    Essentially, rather than setting aside a file on disk as your VM's disk, you can set aside a whole existing disk. That can be a disk that already has Windows installed on it; it doesn't erase what you have. Then you can start Windows in a VM and let it do its updates - since it can't see the bootloader from within the VM, it can't fuck it up. You can run any software that doesn't have particularly high graphics requirements, too.

    I was also able to just "restart in Windows" if I wanted full performance for a game or something like that, but since Linux has gotten very good indeed at running games, that became less and less necessary until one day I just erased my Windows partition to recover the space.
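    If it helps, the VirtualBox side of that guide boils down to creating a VMDK stub that just points at the physical disk. A minimal sketch - the device path and output filename are placeholders, and you need read/write access to the device:

    ```python
    # Create a VirtualBox "raw disk" VMDK that points at an existing physical
    # disk, as in the superuser guide linked above. /dev/sdb and the output
    # path are placeholders - point it at the disk that actually holds your
    # Windows install, and run with enough privileges to access that device.
    import subprocess

    RAW_DISK = "/dev/sdb"                    # placeholder: the Windows disk
    VMDK_PATH = "/home/me/VMs/win-raw.vmdk"  # placeholder: where to put the stub

    subprocess.run(
        [
            "VBoxManage", "internalcommands", "createrawvmdk",
            "-filename", VMDK_PATH,
            "-rawdisk", RAW_DISK,
        ],
        check=True,
    )
    # Then attach win-raw.vmdk to a new VM as its hard disk in the VirtualBox GUI.
    ```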

  • I'd suggest that we do not. How about we split the difference, and drop them off halfway between Belfast and Stranraer, say?

  • Now, for the best battery health, you just need to charge everything that's used portably but plugged in every night to 80%, and keep everything that's only occasionally moved from place to place but only ever used plugged in at 50%.

    100% charges are for those occasions when you'll be working away from power for a few days.
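    On Linux laptops whose firmware exposes charge limits, the kernel makes the 80% part a one-line write to sysfs. A sketch, assuming a battery named BAT0 that provides charge_control_end_threshold (many ThinkPads and some other machines do; plenty don't):

    ```python
    # Set a charge limit via the kernel's power_supply sysfs interface.
    # Assumes the battery is BAT0 and the firmware exposes the threshold file;
    # if the path doesn't exist on your machine, this won't apply. Needs root.
    from pathlib import Path

    THRESHOLD = 80   # stop charging at 80% for a machine that lives on the charger
    path = Path("/sys/class/power_supply/BAT0/charge_control_end_threshold")

    if path.exists():
        path.write_text(f"{THRESHOLD}\n")
        print(f"Charge limit set to {THRESHOLD}%")
    else:
        print("This battery/firmware doesn't expose a charge threshold")
    ```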

  • As an example of a language that many people are familiar with, which is likely to be in long-term use where maintainability is most important, and which can almost read like pseudocode anyway, sure - probably the best 'real language' choice.

  • You can write an unmaintainable fucking mess in any language. Rust won't save you from cryptic variable naming, copy-paste code, a complete absence of design patterns, dreadful algorithms, large classes of security issues, unfathomable UX, or a hundred other things. "Clean code" is (mostly) a separate issue from choice of language.

    Don't get me wrong - I don't like this book. It manages to be both long-winded and facile at the same time. A lot of people seem to read it and take exactly the wrong lessons about maintainability from it. I think it would mostly benefit from being written in pseudocode - concentrating on any particular language might distract from the message. But having a few examples of what a shitfest looks like in a few specific languages might help.
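    In that spirit, a tiny Python illustration - the same calculation written as a mess and then with names that carry the meaning. Nothing about the mess is language-specific (the pricing example itself is made up):

    ```python
    # The same calculation twice: once as the kind of shitfest you can write in
    # any language, once with names that carry the meaning. The mess is a
    # choice, not a language feature.

    def calc(d, f, x):                        # cryptic names, magic numbers
        return [i * f * 1.2 + x for i in d if i > 10]

    def vat_inclusive_prices(prices, fx_rate, delivery_fee,
                             vat_multiplier=1.2, minimum_price=10):
        """Convert prices to local currency, add VAT and delivery, skipping
        anything at or below the minimum price."""
        return [
            price * fx_rate * vat_multiplier + delivery_fee
            for price in prices
            if price > minimum_price
        ]
    ```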

  • My old job had a lot of embedded programming - hard real-time Z80-family work, on processors like the Z800 and eZ80, controlling industrial devices. Actually quite a pleasant instruction set to do bit-twiddling in, and it's great to be able to step through the debugger and see that what the CPU is running is literally your source code, opcode by opcode.

    Back when computers were very simple things - I'm thinking of a ZX Spectrum, where you can read directly from the input ports and write directly into the framebuffer, with no OS in your way, just your code - assembly made a lot of sense, and was even fun. On modern computers, it is not so fun:

    • x64 is just a fucking mess
    • you cannot just read and write what you want, the kernel won't let you. So you're going to be spending a lot of your time calling system routines.
    • 99% of your code will just be arranging data to suit the calling convention of your OS, and doing pointless busywork like stack pointer alignment. Writing some macros to do it for you makes your code look like C. Might as well just use C, in that case.

    Writing assembly still makes sense sometimes - it's required for embedded work, you might be writing something very security-conscious where timing is essential, or you might be lining up data for vectorisation where higher-level languages don't have the constructs to get it right - but these are very small bits of code. You would be mad to consider "making the whole apple pie" in assembly.

  • London in particular has a very transient local population - a lot of people move there for a few years and then move on. Wikipedia has the city at ~50% 'overseas born' at the moment - it's a very cosmopolitan place. So having about double the number of tourists as 'residents' isn't going to have the same cultural impact that it would in some of the other cities here.

    I'm surprised that there are as many as 250k 'locals' in Venice; it was my understanding that they mostly live inland or up the coast and commute into the city.

  • That would be the 25 mm² stuff, about 9 mm diameter. Pretty standard for electric ovens.

    The joy of producing electricity from renewables at 12V DC is that you can run it straight into a whole bank of car and truck batteries for storage. You can then either use it directly for powering things - there are a lot of things like portable tellies for use in a caravan that are 12V for this reason - or feed it to an inverter to get 240V AC for 'normal' usage. Again, large outdoor stores will have them, because they're intended for this usage.
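    The fat cable is just Ohm's law: the same power at 12V means twenty times the current of a 240V circuit, and at 12V even a small voltage drop is a big slice of the supply. A worked sketch with an assumed load and cable run:

    ```python
    # Why 12V systems need thick cables: same power means much higher current,
    # and at 12V a small voltage drop is a big fraction of the supply.
    # The 1.2kW load and 5m cable run are illustrative assumptions.
    RESISTIVITY_COPPER = 1.68e-8   # ohm metres

    def feeder(power_w, volts, length_m, cross_section_mm2):
        current = power_w / volts                                    # I = P / V
        resistance = RESISTIVITY_COPPER * (2 * length_m) / (cross_section_mm2 * 1e-6)
        return current, current * resistance                         # V = I * R

    for volts in (240, 12):
        current, drop = feeder(power_w=1200, volts=volts,
                               length_m=5, cross_section_mm2=25)
        print(f"{volts}V: {current:.0f} A, {drop:.2f} V dropped "
              f"({100 * drop / volts:.2f}% of supply)")
    # 240V: 5 A, 0.03 V dropped (0.01% of supply)
    # 12V: 100 A, 0.67 V dropped (5.60% of supply)
    ```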

  • Cheaper for now, since venture capitalist cash is paying to keep those extremely expensive servers running. The AI experiments at my work (automatically generating documentation) have about an 80% reject rate - sometimes they're not right, sometimes they're not even wrong - and once you account for the time spent reviewing it all, it's not really an improvement over just doing the work ourselves.

    No doubt there are places where AI makes sense; a lot of those places seem to be in enhancing the output of someone who is already very skilled. So let's see how "cheaper" works out.

  • The PS3 most certainly had a separate GPU - it was based on the GeForce 7800 GTX. Console GPUs tend to be a little faster than their desktop equivalents, as they share the same memory. Rather than the CPU having to send e.g. model updates across a bus to update what the GPU is going to draw in the next frame, it can change the values directly in the GPU's memory. And of course, the CPU can read the GPU framebuffer and make tweaks to it - that's incredibly slow on desktop PCs, but console games can do things like tone mapping whenever they like, and it's been a big problem for the RPCS3 developers to make that kind of thing run quickly.

    The Cell cores are a bit more like the 'tensor' cores you'd get on an AI accelerator than full-blown CPU cores. They can't talk to RAM directly, just exchange data between themselves - the main CPU needs to copy data in and out of them, and also schedule any jobs that run on them; they can't do it themselves. They're also a lot more limited in what they can do than a main CPU core, but they are very, very fast at what they can do.

    If you are doing the kind of calculations where you've a small amount of data that needs a lot of repetitive maths done on it, they're ideal. Bitcoin mining or crypto breaking for instance - set them up, let them go, check in on them occasionally. The main CPU acts as an orchestrator, keeping all the cell cores filled up with work to do and processing the end results. But if that's not what you're trying to do, then they're borderline useless, and that's a problem for the PS3, because most of its processing power is tied up in those cores.
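    That orchestrator pattern, in miniature, looks something like the sketch below - a Python analogy only, with a pool of worker processes standing in for the SPEs, nothing like real Cell code:

    ```python
    # Toy version of the orchestrator pattern: one "main core" keeps a pool of
    # simple-but-fast workers fed with small, maths-heavy jobs and collects the
    # results. An analogy for the PPU/SPE split, not actual Cell programming.
    from multiprocessing import Pool

    def crunch(chunk):
        # Stand-in for the repetitive maths an SPE-style core is good at.
        return sum(x * x for x in chunk)

    def main():
        data = list(range(1_000_000))
        chunks = [data[i:i + 10_000] for i in range(0, len(data), 10_000)]

        # The main CPU schedules the work and gathers results; the workers never
        # fetch their own jobs or talk to main memory on their own.
        with Pool(processes=6) as workers:   # six, like the PS3's game-usable SPEs
            partials = workers.map(crunch, chunks)
        print(sum(partials))

    if __name__ == "__main__":
        main()
    ```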

    Some games have a somewhat predictable workload where offloading makes sense. Got some particle effects - some smoke where you need to do some complicated fluid-and-gravity simulations before copying the end result to the GPU? Maybe your main villain has a very dramatic cape that they like to twirl, and you need to run the simulation on that separately from everything else that you're doing? Problem is, working out what you can and can't offload is a massive pain in the ass; it requires a lot of developer time to optimise, when really you'd want the design team implementing that kind of thing; and slightly newer GPUs are a lot more programmable and can do the simpler versions of that kind of calculation both faster and much more in parallel.

    The Cell processor turned out to be an evolutionary dead end. The resources needed to work with it (expensive developer time) just didn't really make sense for a gaming machine. The things it was better at, it still wasn't quite good enough at - modern GPUs are Bitcoin monsters, far exceeding what the Cell can do, and if you're really serious about breaking crypto then you probably have your own ASICs. Lots of identical, fast CPU cores are what developers want to work with - it's much easier to reason about.

  • Yes, because it doesn't do as much to protect you from data corruption.

    If you have a use case where a barely-measurable increase in speed is essential, but not so essential that you wouldn't just pay for more RAM to keep it in cache, and also it doesn't matter if you get the wrong answer because you've not noticed the disk is failing, and you can afford to lose everything in the case of a power cut, then sure, use a legacy filesystem. Otherwise, use a modern one.
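    The corruption protection boils down to "checksum every block and verify it on read". A toy illustration of the idea - nothing like a real filesystem's on-disk format:

    ```python
    # Toy illustration of what a checksumming filesystem buys you: store a
    # checksum with each block, verify on every read, and refuse to hand back
    # silently-corrupted data instead of returning garbage as if it were fine.
    import zlib

    def write_block(data: bytes):
        return data, zlib.crc32(data)

    def read_block(data: bytes, stored_crc: int) -> bytes:
        if zlib.crc32(data) != stored_crc:
            raise IOError("checksum mismatch: the disk handed back garbage")
        return data

    block, crc = write_block(b"important accounts data")

    corrupted = bytearray(block)
    corrupted[3] ^= 0x01   # one flipped bit, the kind a failing disk returns silently
    try:
        read_block(bytes(corrupted), crc)
    except IOError as e:
        print(e)   # a legacy filesystem would have returned the bad data
    ```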