Posts
0
Comments
185
Joined
3 yr. ago

  • I use 10ten (previously Rikaichamp) for Japanese. I don't think it does full translation, but it gives thorough dictionary lookups (from WWWJDIC) as mouseover tooltips. Very useful if you're trying to learn the language, but maybe not so much if you just want to read stuff quickly. I think it's now available for every major browser, but I mostly use it on FF.

  • I apologize, because between OP's post and looking at the OnlyOffice website, I got the impression that it was only a web app, requiring a web server to run. After reading another comment here I looked harder on the website and found the download links for the standalone versions.

  • Where are these conversations happening? I could see a lot of enterprise-focused groups potentially getting behind OnlyOffice, but individual home users? Not so much.

    EDIT: My mistake! I didn't realize that there are standalone versions of OnlyOffice in addition to the web app version.

  • Despite this, I still bet that they post "nvm fixed it" an hour or two later.

  • Deleted

    Permanently Deleted

  • Whoops! When I looked at the second time that the shift value is calculated, I wondered if it would be inverted from the first time, but for some reason I decided that it wouldn't be. But looking at it again it's clear now that (1 - i) = (-i + 1) = ((~i + 1) + 1), making bit 0 the inverse. Then I wondered why there wasn't more corruption and realized that the author's compiler must perform postfix increments and decrements immediately after the variable is used, so the initial shift is also inverted. That's why the character pairs are flipped, but they still decode correctly otherwise. I hope this version works better:

    // i, b, and n are globals defined earlier in the original
    // program (not shown in this comment).
    #include <stdio.h>

    long main () {
        char output;
        unsigned char shift;
        long temp;

        if (i < 152) {
            shift = (~i & 1) * 7;
            temp = b[i >> 1] >> shift;
            i++;
            output = (char)(64 & temp);
            output += (char)((n >> (temp & 63)) & main());
            printf("%c", output);
        }

        return 63;
    }

    EDIT: I just got a chance to compile it and it does work.
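    As a standalone sanity check of the identity in that comment (this is my own illustration, not part of the original program): in two's complement, -i == ~i + 1, so 1 - i == (~i + 1) + 1, and the low bit of (1 - i) is always the inverse of the low bit of i.

    ```cpp
    #include <cassert>

    int main() {
        for (int i = 0; i < 152; ++i) {
            // (1 - i) == (-i + 1) == ((~i + 1) + 1)
            assert((1 - i) == ((~i + 1) + 1));
            // ...so bit 0 of (1 - i) is the inverse of bit 0 of i,
            // which is why the original shift parity came out flipped.
            assert(((1 - i) & 1) == ((~i) & 1));
        }
        return 0;
    }
    ```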

  • I first learned about Java in the late 90s and it sounded fantastic. "Write once, run anywhere!" Great!

    After I got past "Hello world!" and other simple text output tutorials, things took a turn for the worse. It seemed like if you wanted to do just about anything beyond producing text output with compile-time data (e.g. graphics, sound, file access), you needed to figure out what platform and which edition/version of Java your program was being run on, so you could import the right libraries and call the right functions with the right parameters. I guess that technically this was still "write once, run anywhere".

    After that, I learned just enough Java to squeak past a university project that required it, then promptly forgot all of it.

    I feel like Sun was trying to hit multiple moving targets at the same time, and failing to land a solid hit on any of them. They were laser-focused on portable binaries, but without standardized storage or multimedia APIs at a time when even low-powered devices were starting to come with those capabilities. I presume that things are better now, but I've never been tempted to have another look. Even just trying to get my machines set up to run other people's Java programs has been enough to keep me away.

  • Deleted

    Permanently Deleted

  • I don't know if this will work or even compile, but I feel like I'm pretty close.

    // i, b, and n are globals defined earlier in the original
    // program (not shown in this comment).
    #include <stdio.h>

    long main () {
        char output;
        unsigned char shift;
        long temp;

        if (i < 152) {
            shift = (i & 1) * 7;
            temp = b[i >> 1] >> shift;
            i++;
            output = (char)(64 & temp);
            output += (char)((n >> (temp & 63)) & main());
            printf("%c", output);
        }

        return 63;
    }
  • This genie must've read or watched Brewster's Millions.

  • I saw it at the cinema and vaguely remember enjoying it well enough. It's not a great movie, but it's not awful, either. I didn't know that it was supposed to be terrible; it looks like reviewers gave it a slightly better than average score.

    I don't expect ever to watch it a second time, if that helps.

    Lara Croft and the Cradle of Life, though... All I can remember about it now is that afterwards, my friends and I agreed that we should've trusted our instincts and just walked out after about 30 minutes.

  • That's still newer than any of my daily-use laptops, which are all running full-featured Linux distros just fine. I got 'em all cheap secondhand, and just pumped up the RAM (12–16 GB) and installed SSDs.

  • People are writing a lot of things that I agree with, but I want to chime in with two points.

    The first, which one or two other commenters have touched on, is that in 2024 we have approximately 50 years of content already in existence. There's no need to limit ourselves to what's been released in the last 12 months. Classic books, music, plays, and movies stay popular for decades or centuries. Why feel shamed out of playing old games by 12-year-olds and the megacorps?

    The second thing is: yes, try indie games, and IMO the best place to find them is itch.io on PC. Forget 95% of what's marketed as "indie" on consoles.

  • Did you read all the way to the end of the article? I did.

    At the very bottom of the piece, I found that the author had already expressed what I wanted to say quite well:

    In my humble opinion, here’s the key takeaway: just write your own fucking constructors! You see all that nonsense? Almost completely avoidable if you had just written your own fucking constructors. Don’t let the compiler figure it out for you. You’re the one in control here.

    The joke here isn't C++. The joke is people who expect C++ to be as warm, fuzzy, and forgiving as JavaScript.
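    To make the quoted advice concrete (my own hypothetical example, not from the article): writing the constructor yourself means every member gets a deliberate value, instead of whatever the compiler's implicit rules decide.

    ```cpp
    #include <string>
    #include <utility>
    #include <cassert>

    struct Config {
        std::string name;
        int retries;

        // Hand-written constructor: no guessing about what the
        // compiler-generated initialization would or wouldn't do.
        Config(std::string n, int r) : name(std::move(n)), retries(r) {}
    };

    int main() {
        Config c("db", 3);
        assert(c.name == "db" && c.retries == 3);
        return 0;
    }
    ```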

  • Yeah, I'm sure that almost all of us have felt this way at one time or another. But the thing is, every team behind every moronic, bone-headed interface "update" that you've ever hated also sees themselves in the programmer's position in this meme.

  • When I go to that URL on a stock, direct FF install, I still see that notice.

  • Since you seem earnest, probably play_my_game or possibly gamedev.

  • I'll reserve further comment until I know whether you posted this in this community: a) deliberately and seriously, b) deliberately and sarcastically, or c) by accident.

  • Any time I need to learn something about JS, I go to W3Schools to wrap my head around the basics, then over to MDN for current best practice.

  • That was my first take as well, coming back to C++ in recent years after a long hiatus. But once I really got into it I realized that those pointer types still exist (conceptually) in C, but they're undeclared and mostly unmanaged by the compiler. The little bit of automagic management that does happen is hidden from the programmer.

    I feel like most of the complex overhead in modern C++ is actually just explaining in extra detail about what you think is happening. Where a C compiler would make your code work in any way possible, which may or may not be what you intended, a C++ compiler will kick out errors and let you know where you got it wrong. I think it may be a bit like JavaScript vs TypeScript: the issues were always there, we just introduced mechanisms to point them out.

    You're also mostly free to use those C-style pointers in C++. It's just generally considered bad practice.
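    A small sketch of what I mean (my own illustration, assuming a modern C++ compiler): a raw pointer says nothing about ownership, while std::unique_ptr spells it out in the type and lets the compiler enforce it.

    ```cpp
    #include <memory>
    #include <cassert>

    int main() {
        // C style: the type says nothing about who frees this.
        int* raw = new int(42);
        assert(*raw == 42);
        delete raw; // you have to remember this yourself

        // Modern C++: sole ownership is declared in the type;
        // copies are rejected at compile time and the delete
        // happens automatically when 'owned' goes out of scope.
        std::unique_ptr<int> owned = std::make_unique<int>(42);
        assert(*owned == 42);
        return 0;
    }
    ```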