Posts: 1 · Comments: 96 · Joined: 3 yr. ago

  • BadRAM specifiers can apply to stripes of memory corresponding to certain physical hardware failures. The memmap hack only allows for contiguous allocations. BadRAM's intended for repurposing consumer-grade RAM that might normally be thrown out, not for reconfiguring motherboards that have strange layouts.

  • I think that there are a few pieces to it. There's tradition, of course, but I don't think that's a motive. Also, some folks will argue that keeping one's hands on the keyboard, never reaching for the mouse, is an advantage; I'm genuinely not sure about that. Finally, I happen to be a decent touch typist; this test tells me 87 WPM @ 96% accuracy.

    First, I don't spend that much time at the text editor. Most of my time is either at a whiteboard, synchronizing designs and communicating with coworkers, or reading docs. I'd estimate that maybe 10-20% of my time is editing text. Moreover, when I'm writing docs or prose, I don't need IDE features at all; at those times, I enable vim's spell check and punch the keys, and I'd like my text editor not to get in the way. In general, I think of programming as Naur's theory-building process, and I value my understanding of the system (or my users' understanding, etc.) over any computer-rendered view of the system.

    Second, when I am editing text, I have a planned series of changes that I want to make. Both Emacs and vim descend from lineages of editors (TECO and ed, respectively) built out of primitive operations on text buffers. Both editors allow macro-instructions, today called macros, which are programmable sequences of primitive operations. In vim, actions like reflowing a paragraph (gqap) or deleting everything up to the next semicolon and switching to insert mode (ct;) are actually sentences in a vim grammar which has its own verbs and nouns.

    As a concrete example, I'm currently hacking on the Linux kernel because I have some old patches that I am forward-porting. From the outside, my workflow looks like staring out the window for several minutes, opening vim and editing less than one line over the course of about twenty seconds, and restarting a kernel build. From the inside, I read the error message from the previous kernel build, jump to the indicated line in vim with G, and edit it to not have an error. Most of my time is spent legitimately ~~slacking~~ multitasking. This is how we bring up hardware for initial boot and driver development, too.

    Third! This isn't universal for Linux hackers. I make programming languages. Right now, I'm working with a Smalltalk-like syntax which compiles to execline. There's no IDE for execline and Smalltalks famously invented self-hosted IDEs, so there's no existing IDE which magically can assist me; I'd have to create my own IDE. With vim, I can easily reuse existing execline and Smalltalk syntax highlighting, which is all I really want for code legibility. This lets me put most of my time where it should go: thinking about possibilities and what could be done next.

  • So, you've never known any Unix hackers? I worked for a student datacenter when I was at university, and we were mostly vim users; as far as text-editor diversity, we did have one guy who was into emacs and another who preferred nano. After that, I went to work at Google, where I continued to use vim. As far as fancy IDE features, I do use syntax highlighting and I know how to use the spell checker but I don't use autocomplete. I've heard of neovim but don't have a good reason to try it out yet; maybe next decade?

  • Hi! You are bullshitting us. To understand your own incorrectness, please consider what a chatbot should give as an answer to the following questions which I gave previously, on Lobsters:

    • Is the continuum hypothesis true?
    • Is the Goldbach conjecture true?
    • Is NP contained in P?
    • Which of Impagliazzo's Five Worlds do we inhabit?

    The biggest questions in mathematics do not fit nicely into the chatbot paradigm and demonstrate that LLMs lack intelligence (whatever that is). I wrote about Somebody Else's Paper, but it applies to you too:

    > This attempt doesn't quite get over the epistemological issue that something can be true or false, determined and decided, prior to human society learning about it and incorporating it into training data.

    Also, on a personal note, I recommend taking a writing course and organizing your thoughts prior to writing long posts for other people. Your writing voice is not really yours, but borrowed from chatbots; I suspect that you're about halfway down the path that I described previously, on Lobsters. This is reversible but you have to care about yourself.

  • > Secondarily, you are the first person to give me a solid reason as to why the current paradigm is unworkable. Despite my mediocre recall I have spent most of my life studying AI well before all this LLM stuff, so I like to think I was at least well educated on the topic at one point.

    Unfortunately, it seems that your education was missing the foundations of deep learning. PAC learning, the current meta-framework, has been around for about four decades; at its core is the idea that even the best learners are not guaranteed to learn the solution to a hard problem.

    > I am somewhat curious about what architecture changes need to be made to allow for actual problem solving.

    First, convince us that humans are actual problem solvers. The question is begged; we want computers to be intelligent but we didn't check whether humans were intelligent before deciding that we would learn intelligence from human-generated data.

  • You always need to read what the machine generated for you; the machine can only write code for you, not understand code for you. Here, the biggest issue is that copy might not work if the input and output containers are different, if the input has multiple framerates or audio tracks, etc.

  • Well done. I recently revived the BadRAM kernel patch in order to do something similar; memtest86+ supports that functionality too, using <F4>, <F4>, <F10>, <F10>.

  • Your analogy is bogus because this is the Fediverse and we can defederate from tankies without giving them money. The entire topic revolves around how Framework spends money. Whataboutism in this context is a classic defense of fascism, for what it's worth.

  • C also sucks. Also, stop misgendering yourself; when you respect yourself more, you'll respect others more, and then you'll stop saying that people are cancer.

  • Weren't you taught not to use dehumanizing language when you were a child?

  • I want you to write kernel code for a few years. But we go to Lemmy with the machismo we have, not the machismo we wish we had. Write a JSON recognizer; it should have the following signature and correctly recognize ECMA 404, returning 0 on success and 1 on failure.

     
        int recognizeJSON(const char*);
    I estimate that this should take you about 120 lines of code. My prior estimated defect rate for C programs is about one per 60 lines. So, to get under par, your code should have fewer than two bugs.
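    For calibration, here is one way such a recognizer might be sketched in C, as a hand-rolled recursive-descent checker over the ECMA 404 grammar. This is my own illustrative sketch, not a reference solution; every name other than recognizeJSON is hypothetical, and it validates syntax only, building no parse result.

```c
#include <ctype.h>
#include <string.h>

/* Hypothetical sketch: each helper consumes one grammar production and
   returns a pointer past it, or NULL on a syntax error. */

static const char *value(const char *s);

static const char *skip_ws(const char *s) {
    while (*s == ' ' || *s == '\t' || *s == '\n' || *s == '\r') s++;
    return s;
}

/* string: '"' chars '"', with the ECMA 404 escape sequences */
static const char *string_lit(const char *s) {
    if (*s != '"') return NULL;
    for (s++; *s && *s != '"';) {
        if (*s == '\\') {
            s++;
            if (*s && strchr("\"\\/bfnrt", *s)) s++;
            else if (*s == 'u') {
                s++;
                for (int i = 0; i < 4; i++, s++)
                    if (!isxdigit((unsigned char)*s)) return NULL;
            } else return NULL;
        } else if ((unsigned char)*s < 0x20) return NULL;  /* raw control char */
        else s++;
    }
    return *s == '"' ? s + 1 : NULL;
}

/* number: -? int frac? exp?, rejecting leading zeros like 01 */
static const char *number(const char *s) {
    if (*s == '-') s++;
    if (*s == '0') s++;
    else if (isdigit((unsigned char)*s)) while (isdigit((unsigned char)*s)) s++;
    else return NULL;
    if (*s == '.') {
        if (!isdigit((unsigned char)*++s)) return NULL;
        while (isdigit((unsigned char)*s)) s++;
    }
    if (*s == 'e' || *s == 'E') {
        s++;
        if (*s == '+' || *s == '-') s++;
        if (!isdigit((unsigned char)*s)) return NULL;
        while (isdigit((unsigned char)*s)) s++;
    }
    return s;
}

/* shared loop for arrays and objects; objects also demand "key": pairs */
static const char *sequence(const char *s, char close, int pairs) {
    s = skip_ws(s + 1);
    if (*s == close) return s + 1;
    for (;;) {
        if (pairs) {
            if (!(s = string_lit(skip_ws(s)))) return NULL;
            if (*(s = skip_ws(s)) != ':') return NULL;
            s++;
        }
        if (!(s = value(s))) return NULL;
        s = skip_ws(s);
        if (*s == ',') s++;
        else return *s == close ? s + 1 : NULL;
    }
}

static const char *value(const char *s) {
    s = skip_ws(s);
    if (*s == '{') return sequence(s, '}', 1);
    if (*s == '[') return sequence(s, ']', 0);
    if (*s == '"') return string_lit(s);
    if (!strncmp(s, "true", 4)) return s + 4;
    if (!strncmp(s, "false", 5)) return s + 5;
    if (!strncmp(s, "null", 4)) return s + 4;
    return number(s);
}

int recognizeJSON(const char *s) {
    s = value(s);
    return (s && *skip_ws(s) == '\0') ? 0 : 1;
}
```

    At roughly eighty lines this lands near the estimate above; whether it carries fewer than two defects is exactly the bet.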

  • But if you had read the article and attached links then you would have learned that the particular issue under discussion and source of other issues is from Project Big Sleep, which focuses on using generative tooling to confabulate issues. You would also see for yourself that the reported issues are in C.

  • They had you right the first time. You have a horde of accounts and your main approach is to post Somebody Else's Opinion for engagement. You have roughly the political sophistication of a cornstalk and you don't read the articles that you submit. You don't engage on anything you've posted except to defend your style of posting. There's no indication that you produce Free Software. You use Lemmy like Ghislaine Maxwell used Reddit.

  • RPython, the toolchain which is used to build JIT compilers like PyPy, supports Windows and non-Windows interpretations of standard Python int. This leads to an entire module's worth of specialized arithmetic. In RPython, the usual approach to handling the size of ints is to immediately stop worrying about it and let the compiler tell you if you got it wrong; an int will have at least seven-ish bits but anything more is platform-specific. This is one of the few systems I've used where I have to cast from an int to an int because the compiler can't prove that the ints are the same size and might need a runtime cast, but it can't tell me whether it does need the runtime cast.

    Of course, I don't expect you to accept this example, given what a whiner you've been down-thread, but at least you can't claim that nobody showed you anything.

  • Java is bad but object-based message-passing environments are good. Classes are bad, prototypes are also bad, and mixins are unsound. That all said, you've not understood SOLID yet! S and O say that just because one class is Turing-complete (with general recursion, calling itself) does not mean that one class is the optimal design; they can be seen as opinions rather than hard rules. L is literally a theorem of any non-shitty type system; the fact that it fails in Java should be seen as a fault of Java. I is merely the idea that a class doesn't have to implement every interface or be coercible to any type; that is, there can be non-printable non-callable non-serializable objects. Finally, D is merely a consequence of objects not being functions; when we want to apply a function f to a value x but both are actually objects, both f.call(x) and x.getCalled(f) open a new stack frame with f and x local, and all of the details are encapsulation details.

    So, 40%, maybe? S really is not that unreasonable on its own; it reminds me of a classic movie moment from "Meet the Parents" about how a suitcase manufacturer may have produced more than one suitcase. We do intend to allocate more than one object in the course of operating the system! But also it perhaps goes too far in encouraging folks to break up objects that are fine as-is. O makes a lot of sense from the perspective that code is sometimes write-once immutable such that a new version of a package can add new classes to a system but cannot change existing classes. Outside of that perspective, it's not at all helpful, because sometimes it really does make sense to refactor a codebase in order to more efficiently use some improved interface.

  • Look, just because you don't click bluelinks doesn't imply that anybody using them is a bot. Sometimes Wikipedia really does have useful information. If you don't want to get talked to in a condescending manner, don't reply to top-level posts with JAQs or sealions.

  • Y'know, knowing that you live in DACH, I can't help but read this as sour grapes: if only you were allowed to be more fascist, but those mean old online communists just won't let you!

  • Given that I've never seen you in the Ruby, Rails, or Sinatra communities, I'm going to guess that you aren't actually part of this conversation. Also, you've been fairly obvious in your cryptofascism since this Lemmy instance was set up; you're one of several users that have ensured that programming.dev has a fairly bad federated reputation, and I'm not sure that anybody really cares whether you're included given that you don't appear to publish Free Software or anything else useful.

  • Weird way to say that you haven't heard of yinglets.

  • Positive Affirmations for Site Reliability Engineers (Programmer Humor @programming.dev, locked)