Posts: 1
Comments: 481
Joined: 3 yr. ago

  • I'm with our binary friend; the systems they try to replace tend to be time tested, reliable and simple (if not necessarily immediately obvious) to manage. I can't think of a single instance where a Redhat-ism is better, or even equivalent, to what we already have. In each case it's been a pretty transparent attempt to move from Embrace to Extend, and that never ends well for the users.

  • It's amazing how many linux problems stem from 'Redhat, however, found this solution too simple and instead devised their own scheme'. Just about every over-complex, bloated bit of nonsense we have to fight with has the same genesis.

  • I was more suggesting that it might be a bit eldritch, but sometimes humor doesn't come across quite right.

    The linked paper is focused on studying the 'perforation-type anchor' they use to hold the tissue to the mold as it grows, rather than keeping it alive afterwards. During growth the tissue and mold were submerged, or partially submerged, in a suitable medium to keep the cells healthy, and it was only when the resulting models were tested that they were removed (although one test did seem to involve letting it dry out to see if the anchors held). Growing the various layers of cells seems to be a solved problem, and I suspect that includes keeping them supplied with nutrients and such, so the authors aren't examining that. What's not solved is how to keep the tissue attached to a robot, which is what the authors were studying.

  • Do you really want to know? There are some things that the human mind is not meant to contemplate.

  • Let them colonize the sun. That'd show us no-good heathens who's boss, when we have to look up at their new home every day. Surely their god will keep them safe on such a glorious bit of missionary work.

  • This seems like a very complicated way to achieve your goal! It sounds like sitting yourself down and giving you a stern talking to might be a better approach.

    Having said that, if you have these very important files that you don't want to lose, please make sure they're backed up somewhere off of your machine. Storage fails, and it's a horrible feeling losing something important. Unfortunately doing so would defeat the approach you're thinking of.

    This might be a case of needing to reframe the question to get to the cause of the issue, and then solve that. So, why do you want to make it hard to reinstall your machine? Is it the amount of time you spend on it, the chance of screwing it up, needing it working, has it become a compulsion or something else? Maybe if we can get to the root of the issue we can find a solution.

    With regard to TPM, it's basically just a key store, so you can use it for anything really, although it's normally used by generating a TPM key and using that to encrypt the key that actually encrypts your data, storing the encrypted key with the OS. Just reinstalling won't wipe the TPM, but unless you made an effort to save the encrypted key it'll be gone. Given your problem statement above it just adds to the data you'd need to save, which isn't helpful.
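    To make the two-layer scheme above concrete, here's a toy sketch with openssl standing in for the TPM. All the file names are made up for illustration, and on a real system the wrapping key would be sealed inside the TPM hardware rather than sitting in a file:

    ```shell
    # The data key: the key that actually encrypts your files/disk.
    openssl rand -hex 32 > datakey.txt

    # The wrapping key: on a real system this lives inside the TPM
    # and never leaves it; here it's just a file for demonstration.
    echo "pretend-this-is-sealed-in-the-tpm" > kek.txt

    # Wrap (encrypt) the data key. Only this wrapped copy is stored
    # alongside the OS, which is why a reinstall can lose it.
    openssl enc -aes-256-cbc -pbkdf2 -pass file:kek.txt \
        -in datakey.txt -out datakey.enc

    # At unlock time the TPM releases the wrapping key, so the data
    # key can be recovered and used to decrypt the disk.
    openssl enc -d -aes-256-cbc -pbkdf2 -pass file:kek.txt \
        -in datakey.enc -out datakey.dec

    cmp datakey.txt datakey.dec && echo "data key recovered"
    ```

    The point of the indirection is that the bulk data never has to be re-encrypted to change the unlock mechanism; only the small wrapped key does.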

  • Yes, the hypothetical posed does reveal more about the human mind; as I mention in another comment, really it's just a thought experiment as to whether the concept of an entity that doesn't (yet) exist can change our behavior in the present. It bears similarities to Pascal's Wager in considering an action, or inaction, that would displease a potential powerful entity that we don't know to exist. The bits about extracting your consciousness are just framing, and not something to consider literally.

    Basically, is it rational to make a sacrifice now to avoid a massive penalty (eternal torture/not getting into heaven) that might be imposed by an entity you either don't know to exist, or that you think might come into existence but isn't around now?

  • I think the concept is that the AI is just so powerful that humans can't use it, it uses them, theoretically for their own benefit. However, yes, I agree people would just try to use it to be awful to each other.

    Really it's just a thought experiment as to whether the concept of an entity that doesn't (yet) exist can change our behavior in the present.

  • I'm not suggesting it could, or would, happen, merely pointing out the premise of the concept as outlined by Roko as I felt the commenter above was missing that. As I said, it's not something I'd take seriously, it's just a thought experiment.

  • Whilst I agree that it's definitely not something to be taken seriously, I think you've missed the point and magnitude of the prospective punishment. As you say, current groups already punish those who did not aid their ascent, but that punishment is finite, even if fatal. The prospective AI punishment would be to have your consciousness 'moved' to an artificial environment and tortured forever. The point being not to punish people, but to provide an incentive to bring the AI into existence sooner, so it can achieve its 'altruistic' goals faster. Basically, if the AI does come into existence, you'd better be on the team making that happen as soon as possible, or you'll be tortured forever.

  • Ok, I'm still not clear on exactly what you're trying to achieve as I can't quite see the connection between somehow preventing certain files being duplicated when cloning the disk and preventing yourself from reinstalling the system.

    Bear in mind that reinstalling the system would replace all of the OS, so there's no way to leave counter-measures there, and the disk itself can't do anything to your data, even if it could detect a clone operation.

    If what you're trying to protect against is someone who knows everything you do accessing your data, you could look to use TPM to store the encryption key for your FDE. That way you don't know the password, it's stored encrypted with a secret key that is, in turn, stored and protected by your CPU. That way a disk clone couldn't be used on any hardware except your specific machine.

  • Nothing can prevent a disk clone cloning the data, and there's no way to make something happen when a disk is cloned as you're not in control of the process.

    If you wish to mask the existence of the files, use either full disk encryption, in which case cloning the disk doesn't reveal the existence of the files without the decrypt password, or use a file-based encrypted container such as VeraCrypt, in which case the cloner would just see a single encrypted blob rather than your file names.

    Ultimately encrypting the files with gpg means they have already effectively 'destroyed or corrupted' themselves when cloned. If you don't want to reveal the filenames, just call them something else.

    If you could be a bit more specific about your threat model people may have better ideas to help.
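    As a minimal sketch of the gpg approach mentioned above (the file names and passphrase here are invented for the example, and the `--batch`/`--pinentry-mode loopback`/`--passphrase` flags are only used to make it non-interactive; normally you'd let gpg prompt you):

    ```shell
    # A sensitive file we don't want readable, or even recognisable,
    # on a cloned disk.
    echo "the sensitive contents" > secret-plans.txt

    # Symmetric encryption with a standalone passphrase, written out
    # under a deliberately boring name.
    gpg --batch --yes --pinentry-mode loopback --passphrase "example-only" \
        --symmetric --cipher-algo AES256 -o notes-2019.bak secret-plans.txt

    # A clone of the disk now only contains an opaque blob called
    # notes-2019.bak. Decrypt when you actually need the contents:
    gpg --batch --pinentry-mode loopback --passphrase "example-only" \
        -d notes-2019.bak > recovered.txt

    cmp secret-plans.txt recovered.txt && echo "round trip ok"
    ```

    Remember to securely delete the plaintext original afterwards, otherwise the clone still carries it.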

  • I don't think foxes typically go for cats around here, and, as far as I'm aware, not much eats them either. We don't have any of the larger predators that might kill them just to remove competition either, so I suppose foxes are apex predators here too.

    On the other hand, I can see either a cat or a fox being a tasty morsel for a bear, so the whole apex/meso distinction is certainly location dependent.

  • Thanks a lot, I just sprained my brain trying to make sense of that.

  • It sounds like you're actually more concerned about the data in the files not being able to 'pop up' elsewhere, rather than the files themselves. In this case I'd suggest simply encrypting them, probably using gpg. That'll let you set a password that is distinct from the one used for sudo or similar.

    You should also be using full disk encryption to reduce the risk of a temporary file being exposed, or even overwritten sectors/pages being available to an attacker.

  • I think that might be geographically dependent; for instance there's nothing around here that would predate cats, which would suggest they are, locally at least, apex by default.

    There aren't many wildcats left here though, maybe if there were we'd see larger predators move in and push cats down the food chain, so I can see the mesopredator argument.

  • It looks like AssDB uses a weird SQL syntax? Is it worth upgrading to? I hear it's great at pulling information out of unstructured and even imaginary data sources.

  • You've taken an apex predator, evolved for the stresses of the tooth and claw natural world, fulfilled their every need and whim, and now all they have left is choir practice and occasional surprise attacks on unwary feet.

  • Ah, memories. That was me on a Spectrum. It's all fun and games until you forget to save (to tape) and your code hangs the machine, losing everything.