Posts 1 · Comments 707 · Joined 2 yr. ago

  • Yeah I'm watching Ty. Pytype and Pyre are not serious options. Nobody really uses them, and Pytype is discontinued. Facebook have a new project called Pyrefly that's also worth watching.

    But for now, use Pyright. No argument. If you're really worried about Microsoft (and not Facebook or Google for some reason) then use BasedPyright.

  • As long as they don't remove it from the IoT LTSC edition I don't care.

  • I would say:

    1. Just practice, do projects. Also, if you can, work on projects with other people, because you'll read a lot of bad code and learn how not to do things (hopefully).
    2. Learn lots of programming languages. They often have different and interesting ways of doing things that can teach you lessons that you can bring to any language. For example Haskell will teach you the benefit of keeping functions pure (and also the costs!).

    If you only know Python I would recommend:

    1. Learn Python with type hints. Run Pyright (don't use mypy; it sucks) on your project and get it to pass (there's a small sketch of what that looks like after this list).
    2. Go is probably a sensible next step. Very quick to learn but you'll start to learn about proper static typing, multithreading, build tools (Go has the best tooling too so unfortunately it's all downhill from here...), and you can easily build native executables that aren't dog slow.
    3. C++ or Rust. Big step up but these languages (especially C++) will teach you about how computers actually work. Pointers, memory layouts, segfaults (in C++). They also let you write what we're now calling "foundational software" (formerly "systems software" but that was too vague a term).
    4. Optionally, if you want to go a bit niche, one of the functional programming languages like Haskell or OCaml. I'd probably say OCaml because it's way easier (it doesn't force everything to be pure). I don't really like OCaml so I wouldn't spend too much time on this but it has lots of interesting ideas.
    5. Final boss is probably a dependently typed language like Lean or Idris. Pretty hardcore and not really of much practical use if you aren't writing software that Must Not Fail Ever. You'll learn loads about type systems though.
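
    As a minimal, hypothetical sketch of what point 1 looks like in practice (the function is made up; the point is that Pyright checks the hints before anything runs):

    ```python
    # Hypothetical example: plain Python plus type hints that a checker like
    # Pyright verifies statically.
    def mean(values: list[float]) -> float:
        if not values:
            raise ValueError("empty list")
        return sum(values) / len(values)

    print(mean([1.0, 2.5, 4.0]))
    # print(mean(["1.0", "2.5"]))  # Pyright would flag this call before you ever run it
    ```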

    Also read programming articles on Hacker News.

  • Clean Code was pretty effectively debunked in this widely shared article from 2020. We probably don't need to talk about it anymore.

    Frankly I'm surprised it was ever recommended. Some of the things it says are so obviously insane, why would anyone think it was good?

    My only guess is the title? "Your code sucks; maybe read this book that I haven't vetted about clean code." sort of thing?

    I'd say it would be good to have a modern replacement with good advice to recommend... But in my experience you can't really learn these things by reading about them. You have to experience it (and have good natural taste).

    This list of code smells is pretty decent at least: https://luzkan.github.io/smells/

  • Yeah he's dead wrong here. Even clang-format - easily the worst autoformatter I've used - is an order of magnitude more tolerable than no auto-formatting.

    Sure it might not always be as good as what a perfectionist human would produce, but it's sure as hell better than what the average human produces, and it means you don't have to waste time ranting like this.

  • I use Google Slides - you get better control over the layout.

  • Not on RISC-V. The registers don't really have an endianness. They're just bit vectors - you can't address within them.

    When you access memory the current endianness setting determines the mapping between the register value and the bytes in memory. It's the access that has endianness; not the register.
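
    To make that concrete, here's a tiny sketch (in Python rather than assembly, purely as an illustration): the same "register" value only acquires a byte order at the point where it's stored to memory.

    ```python
    # Illustration only: a register value is just a bit vector; byte order
    # appears when the value is mapped to bytes in memory.
    import struct

    reg = 0x11223344                 # the value "in the register"

    little = struct.pack("<I", reg)  # store with a little-endian access
    big = struct.pack(">I", reg)     # store with a big-endian access

    print(little.hex())  # 44332211
    print(big.hex())     # 11223344
    ```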

  • All three of those languages have library ecosystems at least as good as Python's. Typescript is just as easy to learn and as fast to write as Python. I don't see why you'd think Python is faster. If I add up all the time I've lost to Python's terrible tooling it's quite a lot slower!

    Rust is definitely harder to learn - I'll give you that. But once you have learnt it, it's just as fast as Typescript and Python. Especially if your "fast to write" metric measures the time until your program is correct.

  • I think Python is superficially easier since you don't have to declare variables, printing is a little easier, etc. And in some ways it is actually easier, e.g. arbitrary precision integers, no undefined, less implicit type coercion.

    But I agree JavaScript is generally a better choice. And it is actually more popular than Python so...

  • I think it's just because it is always recommended as an "easy" language that's good for beginners.

    The only other thing it has going for it is that it has a REPL (and even that was shit until very recently), which I think is why it became popular for research.

    It doesn't have anything else going for it really.

    • It's extraordinarily slow.
    • The static type hints are pretty decent if you use Pyright but good luck convincing the average Python dev to do that.
    • The tooling is awful. uv is a lifesaver there but even with uv it's a bit of a mess.
    • The package system is a mess. Most people just want to import files using a relative path, but that's pretty much impossible without horrible hacks.
    • The official documentation is surprisingly awful.
    • Implicit variable declaration is a stupid footgun (see the sketch below).

    The actual syntax is not too bad really, but everything around it is.
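
    A tiny made-up example of that footgun:

    ```python
    # Made-up example: a typo silently declares a new variable instead of
    # raising an error where the mistake was made.
    total_count = 0
    for n in range(10):
        total_cout = total_count + n  # typo ("cout") - accepted without complaint
    print(total_count)  # prints 0; the bug surfaces far from the typo
    ```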

  • I mean... C is a low bar. You can write Typescript, Rust and Go code 5x faster than C too.

  • pip is easily the worst thing about Python. But now that we have uv I would say the worst thing is the package/import system. I'm pretty sure only 1% of developers understand it, and it only really works properly if your Python code is a Python package.

    If you treat Python as a scripting language and just scatter loose files around your project and run them directly, it doesn't work at all. Pain everywhere. Which is dumb as fuck because that's like 80% of how people use Python.
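
    A made-up sketch of the kind of hack I mean (hypothetical layout: scripts/run.py wanting code from lib/helpers.py, with no package structure):

    ```python
    # scripts/run.py - running this file directly means relative imports fail
    # ("attempted relative import with no known parent package"), so people
    # patch sys.path by hand instead:
    import sys
    from pathlib import Path

    sys.path.insert(0, str(Path(__file__).resolve().parent.parent))

    from lib import helpers  # only resolvable because of the hack above
    ```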

  • > Very easy to install

    This has to be a joke.

  • Unlikely - you'd do packet processing in hardware, either through some kind of peripheral or, if you're using RISC-V, by adding custom instructions.

  • > He's right. I think it was really a mistake for RISC-V to support it at all, and any RISC-V CPU that implements it is badly designed.
    >
    > This is the kind of silly stuff that just makes RISC-V look bad.

    Couldn't agree more. RISC-V even allows configurable endianness (bi-endian). You can have machine mode little endian, supervisor mode big endian, and user mode little endian, and you can change that at any time. Software can flip its endianness on the fly. And don't forget that instruction fetch ignores this and is always little endian.

    Btw the ISA manual did originally have a justification for having big endian but it seems to have been removed:

    > We originally chose little-endian byte ordering for the RISC-V memory system because little-endian systems are currently dominant commercially (all x86 systems; iOS, Android, and Windows for ARM). A minor point is that we have also found little-endian memory systems to be more natural for hardware designers. However, certain application areas, such as IP networking, operate on big-endian data structures, and certain legacy code bases have been built assuming big-endian processors, so we have defined big-endian and bi-endian variants of RISC-V.

    This is a really bad justification. The cost of defining an optional big/bi-endian mode is not zero, even if nobody ever implements it (as far as I know they haven't). It's extra work in the specification (how does this interact with big endian?), in verification (does your model support big endian?), and so on.

    Linux should absolutely not implement this.

  • Well that just seems silly. You're ok with an inconvenient ad-hoc collection of mandatory IDs, but not a unified one.

  • Yeah you'd think, but that was always the reason that was given.

  • They certainly tried (see Poetry, Pyenv, Conda, etc.). But that was mostly done by Python developers in Python, which is frankly the entire problem.

  • You already need some kind of proof of identity for work and housing. This just designates one identifier as the one you have to use. Hardly different.