I’ve been rocking the original Acer X34 since 2014 and feel like upgrading again. Specifically the AW3423DWF tickles my fancy, but I’m struggling to decide whether or not it is worth it on Mint without HDR support.

I’ve only been running Linux for about a year and have gotten quite comfortable with Mint, but see that I’d need to change distro if I want to use the Plasma DE, which is the only (?) one with decent HDR support at the moment?

Do any of you run HDR capable monitors in Linux?
If yes: is it worth the purchase even if I stick to SDR mode or would you recommend re-rolling distro to get support today?
If I change it up, I’m looking at Fedora.

Thanks in advance!

  • CrazyLikeGollum@lemmy.world · 14 hours ago

    OLED alone, even without HDR, makes a noticeable difference in contrast ratio, meaning blacks look blacker even when right next to bright whites. HDR improves on that, provided you have HDR content to enjoy.

    An issue with some (much) older OLEDs was burn-in, but at least in my experience, with more modern displays that seems to be much less of an issue. A lot of displays have an on-board burn-in reduction feature that generally works well, and the actual LEDs have gotten more durable as the tech has advanced.

    I have an OLED display hooked up to an old Raspberry Pi running my Home Assistant control panel. It’s been displaying an essentially static image for nearly two years without any burn-in.

    Personally, I’d recommend an OLED monitor. If you can afford it, go for high resolution and high refresh rate. If you primarily watch video, prioritize resolution; if you primarily game, prioritize refresh rate. Though you may have issues going over 120 Hz on Linux.

    As for your DE, Mint should support KDE Plasma and you should be able to install it like any other package. Might be worth looking up a guide for that. However, I won’t recommend against switching to Fedora either: it’s what I use, I haven’t had any notable issues, and their documentation seems pretty solid.
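
    As a rough sketch, installing Plasma on Mint’s Ubuntu base might look like this (the package names are an assumption; check your release’s repos first):

```shell
# Assumed package names for Mint/Ubuntu repos; verify with 'apt search plasma'.
sudo apt update
sudo apt install kde-plasma-desktop   # or 'kde-full' for the complete suite
# Then log out and pick the Plasma session from the login screen menu.
```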

  • Showroom7561@lemmy.ca · 1 day ago

    To me, if I had to get a new monitor, it would 100% have to be 120 Hz at 4K OLED with HDR.

    My TV and smartphone are both HDR with high refresh rates and it really puts my laptop and desktop monitors to shame.

  • Zarlin@lemmy.dbzer0.com · 1 day ago

    Totally worth it even in SDR, the pure blacks are so much better than any other form of backlighting, and even without HDR the colors are much more vibrant.

    • Bronzie@sh.itjust.works (OP) · 1 day ago

      Cheers mate. Monitors at work aren’t exactly made for popping colours, so I’ve probably convinced myself the old dog at home is better than it actually is.

  • juipeltje@lemmy.world · 1 day ago

    If I had the funds for an OLED, it would probably still be worth it to me. I’m personally more concerned about burn-in.

    • Luca@feddit.it · 1 day ago

      I would love to get an OLED, but the risk of burn-in scares me to death. I don’t think I’ll ever get one until this issue is fixed.

      • tal@lemmy.today · 1 day ago (edited)

        I don’t think I’ll ever get one until this issue is fixed.

        I mean, LEDs degrade over time. That’s just kind of a fact of life. LED lightbulbs, flashlights, etc. The LED in the backlight of an LCD monitor does too — it just degrades evenly across all pixels, so you don’t get a burn-in effect. Just makes the monitor get dimmer over time (though with LCD monitors that use regional backlighting, I guess some regions could get dimmer before others).

        I don’t think there will ever be some technology that totally stops LEDs degrading. My understanding is that they’ve done various things over the past years to try to mitigate it. I mentioned tandem OLEDs below, which multiply how long the LEDs take to degrade by letting them be run at lower power.

        Maybe someday someone could track power-on time per subpixel element, model each one’s long-term decay, and use that data to turn up the power on each element to compensate for degradation at a per-subpixel level.
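
        A hypothetical sketch of that idea (the exponential decay model, the half-life figure, and the function names are all invented for illustration; a real panel would need measured per-material decay curves):

```python
# Sketch: log on-time per subpixel, model its luminance decay, and boost
# drive to compensate. Decay model and half-life are assumptions.
def remaining_output(hours_on: float, half_life_h: float = 30_000.0) -> float:
    """Fraction of original luminance left after hours_on (assumed decay)."""
    return 0.5 ** (hours_on / half_life_h)

def compensated_drive(target: float, hours_on: float) -> float:
    """Drive level needed so the aged subpixel still emits `target` (0..1)."""
    return min(1.0, target / remaining_output(hours_on))

# A subpixel at its assumed half-life needs double drive to look unchanged:
# compensated_drive(0.4, 30_000) -> 0.8
```

        The `min(1.0, ...)` is the catch: once a subpixel is already at full drive, there is no headroom left to compensate, which is roughly why panels are run dimmer to begin with.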

        EDIT: We also had burn-in on CRTs, and we used those for ages; it didn’t prevent the use of CRTs. I don’t know what the rate of burn-in was relative to OLEDs, but it was real.

        kagis

        https://lunduke.substack.com/p/what-video-games-are-burned-into

        A bunch of CRT arcade monitors with burn-in bad enough that you can see it with the monitor off.

        People just used screensavers or switched monitors off, and eventually, if a monitor became sufficiently problematic, tossed it and got a new one.

        All that being said, if I could have significantly-improved longevity on an OLED display, it’d be nice.

      • Bronzie@sh.itjust.works (OP) · 1 day ago

        I have that too, but being a dad limits my gaming time so much that I reckon the monitor will be obsolete, technology-wise, by the time it becomes an issue.

        Time will tell I guess

  • Psythik@lemm.ee · 15 hours ago (edited)

    No. I wouldn’t even bother with a modern display unless it’s a 4K 120Hz HDR OLED at minimum.

    That said, keep in mind that Linux’s support for HDR is limited at best, and I believe that only KDE supports it. That means no RTX HDR and no AutoHDR, so if a game or video doesn’t have native HDR support, you can only play/watch it in SDR. Which is a damn shame, cause even SDR content looks amazing when converted, especially in the highlights.

    If you don’t mind dual-booting, I’d recommend Windows 11 for SDR games, movies, and YouTube, until Linux gets its own conversion tools. You can also use Win10 (or LTSC), but then you’ll only get RTX HDR, because 10 doesn’t support AutoHDR (which isn’t a huge deal, because RTX HDR can replace AutoHDR in most games).

  • FauxLiving@lemmy.world · 1 day ago

    I use an OLED HDR TV (LG C1) on Arch.

    It is very worth it, even more so with Proton 10 adding HDR support for gaming (without needing to use gamescope).
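
    As a sketch, the Proton-native HDR path is toggled with environment variables in a game’s Steam launch options (these variable names come from recent Proton builds; check your Proton version’s docs before relying on them):

```shell
# Steam > game Properties > Launch Options, assuming Proton 10+ on a
# Wayland session with HDR enabled in the compositor's display settings:
PROTON_ENABLE_WAYLAND=1 PROTON_ENABLE_HDR=1 %command%
```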

    Obviously, if you’re going to buy a good HDR-capable display then you’re going to want HDR to work. A monitor doesn’t need HDR to look good, but HDR feels like a graphical upgrade that doesn’t cost you any frame rate.

    If you’re comfortable enough with Mint, most of that knowledge will transfer over to other flavors of Linux.

    I’d recommend something that lets you use the most current software. Arch, for example. I know it has a reputation for being difficult to install, but it is very much worth doing, as it gives you a lot of hands-on work with the inner workings of Linux. It will take some time (I think it took me the whole day the first time), but the installation guide will walk you through it.

    That being said, for most people I think an Arch install is an excellent project for a VM or a second piece of hardware. For your main PC, you just want it up and running as quickly as possible so you can keep using it.

    EndeavourOS is an Arch-based distro that uses a graphical installer and chooses a decent set of default packages for a desktop PC. That makes the installation of Arch much faster and you’re not left to research every little subsystem in order to figure out what packages you need.

    • Bronzie@sh.itjust.works (OP) · 1 day ago

      I have been dreaming of dabbling with Arch, but I just don’t feel ready for it, if that makes sense?
      With kids, work and life getting in the way, I just don’t have the hours to tinker with stuff like I used to.

      Maybe in a few years when they are older.

      I have set up a few headless home servers with Debian. Hopefully that experience will help then.

      • gnuplusmatt@reddthat.com · 15 hours ago

        There’s plenty of up-to-date, feature-rich distros on the scale between Mint and Arch. Fedora or uBlue/Bazzite, for example, are also good options.

  • lurch (he/him)@sh.itjust.works · 1 day ago (edited)

    HDR is not about the panel type, but about the resolution: transitioning from one color to another over 3000 pixels will often result in very visible steps where each region of one color ends and the next begins. HDR can reduce these steps significantly. You need HDR less if you only have 1080p, regardless of OLED, LCD or whatever. But if you have 4K and the display fills a lot of your field of view (meaning it’s big or you’re close), it can become super annoying without HDR. Of course, some people don’t mind either way. Maybe check it by visiting a store and getting close to the displays; ask them to switch HDR off and on.
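
    A toy calculation of the stepping being described (the 3440 px width is just an example; the banding reduction really comes from the higher bit depth that usually accompanies HDR modes):

```python
# Quantize a smooth horizontal ramp across an ultrawide panel and see how
# wide each flat "step" of identical color ends up being.
def band_width_px(width_px: int, bit_depth: int) -> float:
    """Average width in pixels of each flat step in a full-range ramp."""
    levels = 2 ** bit_depth
    return width_px / levels

print(band_width_px(3440, 8))   # 8-bit SDR: 13.4375 px per step
print(band_width_px(3440, 10))  # 10-bit (usual with HDR): ~3.36 px per step
```

    A ~13-pixel band of uniform color is wide enough to notice on a gradient; at ~3 pixels it starts blending away.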

  • tal@lemmy.today · 1 day ago (edited)

    One thing I might consider is whether you want a tandem OLED monitor. They’re out now, but not widespread, and that monitor is not one, and it’s one area where OLED monitors are improving significantly. My guess — not following closely — is that they will be more-widespread before long.

    The main drawback of OLED monitors on desktops today from my standpoint is that there is some burn-in potential, and the longer one plans to keep the monitor — and you’ve kept your last for some time — the more potential. Running the LEDs at a lower brightness level reduces the impact.

    Tandem OLED monitors use multiple LED layers, which combine their brightness. The rationale is more that it lets one have brighter output — OLED displays aren’t presently as bright as LCD displays, and more brightness is nice, especially for TVs. However, it also means that each layer can be driven at less power, helping mitigate burn-in. They’re also somewhat more power-efficient, since efficiency falls off at brighter levels.
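
    The trade-off can be sketched with a toy model (the superlinear wear exponent here is an assumption for illustration, not a measured panel constant):

```python
# Toy model of the tandem-OLED argument: n emitter layers share a brightness
# target, so each layer runs at 1/n drive. If wear grows superlinearly with
# drive (assumed exponent k > 1), per-layer wear drops faster than 1/n.
def per_layer_drive(target: float, layers: int) -> float:
    """Drive level (0..1) each layer needs to jointly hit `target`."""
    return target / layers

def relative_wear(drive: float, k: float = 2.0) -> float:
    """Assumed wear-rate model: wear ~ drive**k, so drive=1.0 gives 1.0."""
    return drive ** k

# One layer at full brightness wears at 1.0; two layers each run at 0.5
# drive and, under this model, wear at 0.25 apiece.
```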

    For me, brightness isn’t a big deal on desktops, since I’m not needing to use them outdoors, and they don’t need to outshine the sun. And I don’t much care about power efficiency on a desktop monitor. But putting off burn-in would be nice.

    I haven’t seen burn-in on my OLED phone after years of use, but I also don’t use my phone as heavily as a computer gets used, and I understand that the longer daily use of computers is expected to be a more significant factor.

    It is possible to get visible burn-in on existing OLED monitors after a lot less than a decade of use:

    https://hothardware.com/news/qd-oled-burn-in-testing-one-year-results

    There are currently tandem OLED monitors out with five layers.

    On a phone, I myself wouldn’t worry about it, but I also tend to keep desktop monitors for a lot longer than phones.

    • Bronzie@sh.itjust.works (OP) · 1 day ago

      Comprehensive write up. Thanks for this!

      The main reason I went for it now was that the panel has been out a few years and the price is more in line with what I’d like to spend. I’m guessing this new tech will be quite costly to begin with?
      The early adopter tax thankfully no longer gives me a high, but I will read up on these and maybe they will be relevant next time around.
      Seems like interesting tech, for sure!

      • tal@lemmy.today · 1 day ago (edited)

        I’m guessing this new tech will be quite costly to begin with?

        They cost more, yeah. They’ll probably come down as they become more widespread.

        I’m not gonna say “don’t get a single-layer OLED display” — that’s a value call for you. Just saying that you kept the last one for eleven years, and if you plan to keep this one for another eleven years, you might want to keep potential for burn-in in mind, given that we’ve got significant OLED display longevity improvements happening. Depends on a variety of factors, like brightness of display and whether you have static elements onscreen. For some people, it simply doesn’t matter.

        I kept my last monitor for about 15 years, so I’m inclined to favor longevity increases — switched because DVI was pretty much dead. But lots of people aren’t gonna do that, so…shrugs

  • RejZoR@lemmy.ml · 1 day ago

    Yes. OLED has perfect blacks, and if you’re a gamer, it’s also worth it for the pixel response times: OLED has a pixel response time of 0.03 ms, while the best LCD monitors manage 1 ms. The OLED image stays sharp through the entire motion distance. Most games don’t support HDR but still benefit from the above features.

    • Bronzie@sh.itjust.works (OP) · 1 day ago

      Guess I’m just “worried” the upgrade in color accuracy and response time won’t be enough to warrant the purchase, but you all seem encouraging, so I’m leaning towards upgrading. Cheers!

      • RejZoR@lemmy.ml · 16 hours ago

        If you’re a gamer, then absolutely. If not, benefits will not be as obvious. For games, it’s a massive upgrade.

      • kewjo@lemmy.world · 1 day ago (edited)

        My recommendation, if you do, is to look at refresh rate; going from 120 Hz to 240 Hz felt like a much bigger upgrade to me than SDR to HDR, especially if you’re playing games that depend on response times. It just feels smoother. HDR on Linux is decently there on KDE, but there are still issues getting it to work everywhere, like Firefox, though games still look nicer than regular SDR IMO. Also avoid HDMI, especially if using AMD!

  • Ada@lemmy.blahaj.zone · 1 day ago

    I use an AW3423DWF on Plasma in HDR mode. I love it. It’s worth it. I do a lot of photo work with it. Mostly I don’t run games in HDR though, because it involves gamescope, which is more hassle than it’s worth for me.

    • kewjo@lemmy.world · 1 day ago

      I’ve been using scope buddy to manage my gamescope config; it has auto resolution/HDR detection and you can set global defaults. You still have to pass scb -- %command%, but it seems easier to manage. With Proton 10 I set it to actually disable gamescope and just use it to set the Proton Wayland+HDR env variables, and I haven’t had any issues so far.

    • Bronzie@sh.itjust.works (OP) · 1 day ago

      Good to know that you are happy with it in SDR while gaming, which will be my biggest use-case. Cheers!

  • jonathan@lemmy.zip · 1 day ago

    I’m running HDR on Fedora 42 with Gnome. Unless you watch HDR video content or play HDR games I would say it’s not worth worrying about.

  • sunzu2@thebrainbin.org · 1 day ago

    OLED is worth it if you’ve got the coin, mate.

    HDR implementation in monitors is not done properly yet, so it shouldn’t impact your decision IMHO.

    If you care about HDR, I would wait until the next gen, where they hopefully get it implemented properly…

  • Fisch@discuss.tchncs.de · 1 day ago

    I recently upgraded to a 4K monitor with HDR, and shortly after that GNOME 48 came out with HDR and VRR support, so KDE Plasma and GNOME are the two desktops I know of that support HDR. I use GNOME and it works really well, even though I do need to use gamescope if I want to play a Windows game with HDR, and Firefox doesn’t (yet?) support it on Linux.

    It definitely looks really cool, but it’s not a huge loss if you stick with Mint and just use SDR. It seems like you wanna get the monitor either way, so I’m pretty sure you could just use a live USB of something like Fedora to try HDR out without having to actually install anything. I’m just not sure what software you could try it in, because (at least to my knowledge) no browser supports HDR on Linux yet and you can’t just install a whole game on a USB stick.
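
    The gamescope wrapper I mean goes in the game’s Steam launch options, something like this (the resolution/refresh values are just examples, and the flags are from recent gamescope releases; check gamescope --help on your version):

```shell
# Run the game inside gamescope with HDR enabled, fullscreen, at an
# example ultrawide mode; adjust -W/-H/-r to your panel.
gamescope --hdr-enabled -W 3440 -H 1440 -r 144 -f -- %command%
```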

    • Bronzie@sh.itjust.works (OP) · 1 day ago

      Yeah, I have a few extra SSDs in the system, so maybe I’ll give dual booting a crack later on.

      Thanks for sharing your experience!

    • Bronzie@sh.itjust.works (OP) · 1 day ago

      Appreciate the input mate. The overwhelming response here was so positive that I just bit the bullet and ordered it.

      Life’s too short to have 11-year-old monitors, ey?

      • nagaram@startrek.website · 1 day ago

        I also just swapped my monitor out after nearly 12 years with it.

        I think ANYTHING you would have bought new would have looked awesome. Panel tech has advanced.

  • Zikeji@programming.dev · 1 day ago

    I’m not running an OLED but my monitor is HDR capable and I prefer the look, however I don’t run it in HDR mode. Reason being, it fucks with my OBS recordings and I have to up the quality significantly for them to be usable, which ups the storage requirements.

    • Bronzie@sh.itjust.works (OP) · 1 day ago

      Just you being happy with SDR is a positive sign for me, so thanks for the feedback. What do you stream/record?

      • Zikeji@programming.dev · 1 day ago

        Another reason I’m happy with SDR is because I run two monitors and the second doesn’t support HDR. So it provides a consistent look.

        As for recording - really just limited to when I play games like Lethal Company with friends. Just to clip the goofs. Have a whole shitton of them.