As we all know, AC won the “War of the Currents”. The main reason is that AC voltage is easy to step up and down with just a ring of iron and two coils, and high voltage lets us transmit power over long distances with less loss.

Now, the War of the Currents happened around 1900, and our technology has improved a lot since then. We have useful diodes and transistors now, we have microcontrollers and buck/boost converters. We can convert DC voltages efficiently today.

Additionally, photovoltaics naturally produces DC. Whereas a traditional generator has an easier time producing AC, photovoltaic plants have to convert their output to AC, which, if I understand correctly, involves a massive loss.

And then there’s the issue of stabilizing the frequency. When you have one big producer (one big hydro-electric dam or coal power plant), then stabilizing the frequency is trivial, because you only have to talk to yourself. When you have 100000 small producers (assume everyone in a bigger area has photovoltaics on their roof), then suddenly stabilizing the frequency becomes more challenging, because everybody has to work in exactly the same rhythm.

I wonder, would it make sense to change our power grid from AC to DC today? I know it would obviously be a lot of work, since every consuming device would have to change what power it accepts from the grid. But in the long run, could it be worth it? Also, what about insular networks? Would it make sense there? Thanks for taking the time to read this, and I'm willing to go into the maths if that's relevant to the discussion.

  • Ebby@lemmy.ssba.com · 4 months ago

    I heard it said many years ago that if DC had won the battle, we'd have power stations every 10 miles and power lines as thick as your wrist.

    Converting local power is fairly easy, with AC inverters added for universal compatibility.

    But, take note of how many DC voltages you use in your house. Devices in mine range from 3v to 25v and some weird one like 19v for a laptop. You’d still have adapters all over the place.

    • explore_broaden@midwest.social · 4 months ago

      But, take note of how many DC voltages you use in your house. Devices in mine range from 3v to 25v and some weird one like 19v for a laptop. You’d still have adapters all over the place.

      This is probably true, but every single one could lose the rectifier part, and instead of having to smooth the pulsating DC that comes out of mains rectification, you'd get clean DC from the wall, which should allow for smaller capacitors in many places.
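
      A rough back-of-envelope sketch of that capacitor point (the formula is the standard ripple approximation; the load current and ripple figures are my own assumptions):

      ```python
      # Reservoir capacitor needed to ride through a full-wave rectified mains
      # supply between charging peaks: C ~ I / (2 * f_mains * dV_ripple)
      I_load = 1.0       # A, assumed load current
      f_mains = 50       # Hz mains (60 in North America)
      dV_ripple = 1.0    # V of ripple we can tolerate, assumed

      C_mains = I_load / (2 * f_mains * dV_ripple)
      print(f"fed from rectified mains: {C_mains * 1e6:.0f} uF")  # ~10000 uF

      # Fed from already-clean DC, the same stage only has to smooth its own
      # switching ripple at ~100 kHz, so the capacitor shrinks dramatically:
      f_switch = 100e3
      C_dc = I_load / (2 * f_switch * dV_ripple)
      print(f"fed from clean DC:        {C_dc * 1e6:.1f} uF")     # ~5 uF
      ```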

    • gandalf_der_12te@lemmy.blahaj.zoneOP · 4 months ago

      Okay, these are short-term problems. “Power lines as thick as your wrist” depends on the voltage. If voltage conversion works well enough, that issue disappears.

      But, take note of how many DC voltages you use in your house. Devices in mine range from 3v to 25v and some weird one like 19v for a laptop.

      Yeah, that’s why we need some kind of standard for these things.

      • Ebby@lemmy.ssba.com · 4 months ago

        Ha! Yes! Even today USB 5 volts is pretty sweet for low power stuff. USB PD re-complicates things, but it’s not user dependent so that’s a plus.

        And you need a loooot of copper to prevent voltage drop, especially when a grid of 100 houses half a mile long draws 20-80 amps each. The math starts adding up real quick.
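
        To put rough numbers on that (all figures assumed by me; the comparison is really about voltage and works out the same whether the feeder carries AC or DC):

        ```python
        # Copper needed for a half-mile feeder serving 100 houses while keeping
        # voltage drop under 5%. V_drop = I * R, with R = rho * L / A.
        rho = 1.68e-8        # ohm*m, copper resistivity
        L_loop = 2 * 805     # m, roughly half a mile out and back
        P = 100 * 120 * 30   # W: 100 houses drawing 30 A each at 120 V (assumed)
        drop = 0.05          # allowed fractional voltage drop

        def min_conductor_diameter_mm(V):
            I = P / V
            R_max = drop * V / I
            area = rho * L_loop / R_max            # m^2
            return 2 * (area / 3.14159) ** 0.5 * 1000

        for V in (120, 7200):
            print(f"{V:>5} V feeder: ~{min_conductor_diameter_mm(V):.0f} mm diameter copper")
        # A 120 V feeder needs a conductor roughly 130 mm across; at 7.2 kV
        # (a typical distribution voltage) about 2 mm of copper would do.
        ```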

        • bastion@feddit.nl · 4 months ago

          I mean, you need a lot of voltage to make voltage drop irrelevant. Like, 120 or 240 volts. If the distribution voltage is the same for DC and AC, we could use the same wiring (but different breakers, and everything else).

          So the wiring argument doesn’t really hold up - the question is more about efficient converters to reduce voltage once it’s at the house.

          I.e., for typical American distribution, it’s 240 in the neighborhood and drops to 120 in the house. If the dc does the same, the same amount of power can be drawn along existing wires.
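
          A tiny sketch of why that holds (illustrative numbers of my own): resistive loss in a wire depends only on the current flowing through it (skin effect aside, which is negligible at these sizes), so a DC feed at the same voltage and power loads the copper exactly like the AC it replaces.

          ```python
          # I^2 * R loss in an assumed 50 m service run of 6 mm^2 copper
          rho, L_loop, A = 1.68e-8, 2 * 50, 6e-6
          R = rho * L_loop / A                       # ~0.28 ohm

          P, V = 7200, 240                           # 7.2 kW drawn at 240 V
          I = P / V                                  # 30 A (RMS for AC, steady for DC)
          print(f"cable loss ~ {I**2 * R:.0f} W")    # same figure either way
          ```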

          • Quatlicopatlix@feddit.org · 4 months ago

            Yeah, have fun transmitting a decent amount of power at 240 V over a meaningful distance. Also, most generators produce AC anyway, so why would you rectify it at the generator instead of at your device, after a transformer? You still need all kinds of different voltages everywhere in your electronics, and that means you still need to regulate it.

            I am not sure how the American wiring works out, but to get from 240 to 120 you still need a transformer… or is it 240 V between the different phases and then 120 V from phase to neutral?

            • bastion@feddit.nl · 4 months ago

              240 in the neighborhood - i.e., that’s enough to distribute from the pole to a few houses. Of course you have higher voltages to go longer distances. This is equally true for AC vs DC. Thus, the idea that it takes a looot of copper for DC is erroneous.

              In fact, where conductor size does come into it, you can use slightly smaller conductors for DC, because DC doesn't suffer from the skin effect.

              Wiring: split phase, which is also usable as 240 for large appliances. So, the latter.

              • Quatlicopatlix@feddit.org · 4 months ago

                Skin effect at 50 Hz? Yeah, no, not much.

                OK, so every time you change the voltage level you still need an inverter to create AC plus a transformer, so no, it doesn't make any sense.
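
                On the skin-effect point, a quick calculation (standard formula; I just plugged in the two mains frequencies) shows why it barely matters at grid frequency:

                ```python
                import math

                # Skin depth in copper: delta = sqrt(2 * rho / (omega * mu))
                rho = 1.68e-8            # ohm*m, copper resistivity
                mu = 4e-7 * math.pi      # H/m, copper is essentially non-magnetic
                for f in (50, 60):
                    delta = math.sqrt(2 * rho / (2 * math.pi * f * mu))
                    print(f"{f} Hz: skin depth ~ {delta * 1000:.1f} mm")
                # ~9.2 mm at 50 Hz and ~8.4 mm at 60 Hz, so it only starts to
                # matter for conductors several centimetres thick.
                ```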

  • tal@lemmy.today · 4 months ago

    DC is used for long-range transmission in high-voltage DC (HVDC) transmission lines today.

    https://en.wikipedia.org/wiki/High-voltage_direct_current

    A high-voltage direct current (HVDC) electric power transmission system uses direct current (DC) for electric power transmission, in contrast with the more common alternating current (AC) transmission systems. Most HVDC links use voltages between 100 kV and 800 kV.

    HVDC lines are commonly used for long-distance power transmission, since they require fewer conductors and incur less power loss than equivalent AC lines. HVDC also allows power transmission between AC transmission systems that are not synchronized. Since the power flow through an HVDC link can be controlled independently of the phase angle between source and load, it can stabilize a network against disturbances due to rapid changes in power. HVDC also allows the transfer of power between grid systems running at different frequencies, such as 50 and 60 Hz. This improves the stability and economy of each grid, by allowing the exchange of power between previously incompatible networks.

    However, since the grids themselves are AC, HVDC links are just used to send power to a grid or pull it from one.

    We also do have some increasingly beefy DC in individual households in some forms:

    • You mention solar PV systems, but more generally, 12V systems used in vehicles (and the related 24V and 48V systems that are sometimes used to push more power) are more common, with lithium batteries that can do many more charge cycles than lead-acid being available.

    • USB PD can negotiate pushing up to 240W now at 48V, which is a fair bit.

    • gazter@aussie.zone · 4 months ago
      • USB PD can negotiate pushing up to 240W now at 48V, which is a fair bit.

      So if I wanted to wire my home to take advantage of this, supposing I had a house battery on solar, would I have some kind of DC-DC converter from battery to 48V, then cable to outlets with some kind of USB PD adaptor? How much advantage do I get from this, vs using existing 240V outlets + wall wart?
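
      For a rough feel of the trade-off (a sketch with my own assumed cable run and load, not a recommendation): at 48 V you push five times the current of a 240 V circuit for the same power, so the cable loss on the same wire is 25 times higher.

      ```python
      # Same 240 W load over an assumed 15 m run of 2.5 mm^2 copper
      rho, L_loop, A = 1.68e-8, 2 * 15, 2.5e-6
      R = rho * L_loop / A                             # ~0.2 ohm

      for V in (48, 240):
          I = 240 / V
          print(f"{V:>3} V: {I:.1f} A, cable loss ~ {I**2 * R:.1f} W")
      # 48 V: 5 A and ~5 W lost; 240 V: 1 A and ~0.2 W. Both are small at
      # 240 W, but the absolute loss grows with the square of the power.
      ```

      So the wiring loss itself is manageable; the bigger win of staying DC end-to-end would be skipping the battery-to-AC inversion and the wall wart's own conversion step.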

  • empireOfLove2@lemmy.dbzer0.com · 4 months ago

    Well, almost all DC generators these days are actually AC alternators with the output rectified, because alternators can be run a lot more efficiently. So you're already losing some efficiency there.

    You need to consider the consumer side as well. Dinky residential loads like your computer would be fine on DC. But most of the world, especially heavy industry, runs on synchronous or induction AC motors, big ones. Big huge tens-of-megawatts motors that often run upwards of 97% line efficiency, which is insane for any industrial process.
    The best you could replace those with would be modern brushless DC motors, which require really expensive inverter controls that die frequently due to the magnetic transients and still top out at an efficiency of only 90% if you’re lucky. And that would incur huge costs that just aren’t worth it.

    • BearOfaTime@lemm.ee · 4 months ago

      The best you could replace those with would be modern brushless DC motors, which require really expensive inverter controls that die frequently due to the magnetic transients

      Wow, is this why my new brushless cordless tools have had more issues than my 25 year old cordless tools?

      • empireOfLove2@lemmy.dbzer0.com · 4 months ago

        I mean, yes and no. A lot of that is that modern tools are more carefully engineered to operate as close to failure as possible, so as to advertise more power with a cheaper device. They have small wires and encoder sensors that can be prone to failure.
        Yes, the driving electronics are also sensitive; however, magnetic transients are less of a big deal on the scale of a cordless drill. When dealing with huge motors, those can be significant multi-kilovolt spikes that make solid state components Very Very Mad.
        But the brushless motors in a drill also don't have brushes that wear down rapidly in a very dirty/dusty, contaminated environment like older power tools would. So it's a bit of a 50/50.

        • BearOfaTime@lemm.ee · 4 months ago

          spikes that make solid state components Very Very Mad

          Hahaha, now I’m picturing an IC with an angry face just before it farts out Magic Smoke.

          Yea, I’ve had a couple new impact drivers needing the controller replaced, I assume they’re a package.

          While my 25 year old, abused impact of the same brand keeps chugging along, eating 2x as much battery for the same job.

          • empireOfLove2@lemmy.dbzer0.com · 4 months ago

            Hahaha, now I’m picturing an IC with an angry face just before it farts out Magic Smoke.

            Well, that’s basically how they behave too lol. Solid state power components are generally not very tolerant and require careful surge suppression and filtering to not have them blow up frequently.

            I bet if you took that 25-year-old driver apart, sanded the commutator clean, and put new brushes in it, you'd suddenly find it'd have more power and use less battery. (And that's something you can do with older tools!)
            When brushed motors get old and oxidation/dirt builds up, the resistance across the brushes to the rotor coils grows and you lose motor efficiency.

    • cygnus@lemmy.ca · 4 months ago

      Same, that alone is reason enough to stick to AC IMO. It’s so much safer for the end user (or their kids who stick a fork into the outlet).

      • RouxBru@lemmy.world · 4 months ago

        AC lets you go if you get shocked, DC keeps you pulled in. I'm sure if you google this there'd be a video or two, but it's going to be ugly.

        • Pelicanen@sopuli.xyz · 4 months ago

          What do you mean AC “lets you go”? AC causes muscle contractions which keep you from, for example, letting go of a live wire.

          • RouxBru@lemmy.world · 4 months ago

            Look I don’t know the science behind it, and maybe I was just lucky, but in my experience I’ve always been able to pull out of an AC shock. From what I’ve heard you don’t tend to be that lucky with DC

            • Pelicanen@sopuli.xyz · 4 months ago

              DC actually has a higher “let go” threshold than AC does so you’d likely be more okay from a slightly higher voltage DC shock than a lower voltage AC shock.

  • SomeoneSomewhere@lemmy.nz · 4 months ago

    PV inverters often have around 1-2% losses. This is not very significant. You also need to convert the voltage anyway because PV output voltage varies with light level.

    Buck/boost converters work by converting the DC to (messy) AC, then back to DC. If you want an isolating converter (necessary for most applications for safety reasons), that converter needs to handle the full power. If it's non-isolating, the power it has to process is roughly proportional to the voltage step.
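
    A minimal numeric sketch of that first point (idealised, my own numbers): a buck converter chops the DC into a square wave and the output filter averages it back down to a lower DC voltage.

    ```python
    V_in, duty = 48.0, 0.25    # duty cycle: fraction of each cycle the switch is on
    print(duty * V_in)         # ideal buck relation V_out = D * V_in -> 12.0

    # One switching period of the raw, unfiltered switch-node voltage (the "messy AC"):
    steps = 20
    square = [V_in if i / steps < duty else 0.0 for i in range(steps)]
    print(sum(square) / steps)  # averages to 12.0 V, which the LC filter recovers
    ```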

    Frequency provides a somewhat convenient method for all parties to know whether the grid is over- or under- supplied on a sub-second basis. Operating solely on voltage is more prone to oscillation and requires compensation for voltage drop, plus the information is typically lost at buck/boost sites. A DC grid would likely require much more robust and faster real-time comms.
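
    For illustration, this is roughly how frequency acts as that shared signal today (textbook droop-control relation; the settings and numbers are my own): each machine nudges its output based only on locally measured frequency, with no communications needed.

    ```python
    f_nom, droop = 50.0, 0.05          # 50 Hz system, 5% droop setting (assumed)

    def power_command(f, p_set, p_rated):
        # under-frequency means the grid is short on power, so raise output
        return p_set - (f - f_nom) / (f_nom * droop) * p_rated

    for f in (49.9, 50.0, 50.1):
        print(f"{f} Hz -> {power_command(f, p_set=0.5, p_rated=1.0):.2f} pu")
    # 49.9 Hz -> 0.54 pu, 50.0 Hz -> 0.50 pu, 50.1 Hz -> 0.46 pu
    ```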

    The AC grid relies on significant (>10x overcurrent) short-term (<5s) overload capability. Inrush and motor starting requires small/short overloads (though still significant). Faults are detected and cleared primarily through the excess current drawn. Fuses/breakers in series will all see the same current from the same fault, but we want only the device closest to the fault to operate to minimise disruption. That’s achieved (called discrimination, coordination, or selectivity) by having each device take progressively more time to trip on a fault of a given size, and progressively higher fault current so that the devices upstream still rapidly detect a fault.
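
    A small sketch of that discrimination using the IEC “standard inverse” overcurrent curve (the curve itself is standard; the pickup currents and time multipliers are illustrative picks of mine):

    ```python
    def trip_time(I, pickup, tms):
        # IEC 60255 standard-inverse curve: t = TMS * 0.14 / ((I/Is)^0.02 - 1)
        return tms * 0.14 / ((I / pickup) ** 0.02 - 1)

    downstream = dict(pickup=100, tms=0.05)   # device nearest the fault
    upstream = dict(pickup=400, tms=0.20)     # feeder device above it

    I_fault = 2000                            # A, seen by both devices in series
    print(f"downstream trips in {trip_time(I_fault, **downstream):.2f} s")   # ~0.11 s
    print(f"upstream would trip in {trip_time(I_fault, **upstream):.2f} s")  # ~0.86 s
    # The nearest device clears the fault first; the one upstream only acts
    # as backup, which is the coordination described above.
    ```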

    RCDs/GFCIs don’t coordinate well because there isn’t enough room between the smallest fault required to be detected and the maximum disconnection time to fit increasingly less sensitive devices.

    Generators are perfectly able to provide this extra fault current through short term temperature rise and inertia. Inverters cannot provide 5-fold overcurrent without being significantly oversized. We even install synchronous condensers (a generator without any actual energy source) in areas far from actual generators to provide local inertia.

    AC arcs inherently self-extinguish in most cases. DC arcs do not.

    This means that breakers and expulsion type fuses have to be significantly, significantly larger and more expensive. It also means more protection is needed against arcs caused by poor connection, cable clashes, and insulation damage.

    Solid state breakers alleviate this somewhat, but it’s going to take 20+ years to improve cost, size, and power loss to acceptable levels.

    I expect that any ‘next generation’ system is likely to demand a step increase in safety, not merely matching the existing performance. I suspect that’s going to require a 100% coverage fibre comms network parallel to the power conductors, and in accessible areas possibly fully screened cable and isolated supply.

    EVs and PV arrays get away with DC networks because they’re willing to shut down the whole system in the event of a fault. You don’t want a whole neighborhood to go dark because your neighbour’s cat gnawed on a laptop charger.

    • BearOfaTime@lemm.ee · 4 months ago

      Oh wow, thanks for the detailed writeup. It’s a little above my pay grade (condensers used as localized generators? Wow, what an idea. They must be huge).

      Guess it’s time to find an Intro to Powergrids from The Teaching Company

      • gandalf_der_12te@lemmy.blahaj.zoneOP · 4 months ago

        I'll give you a short introduction to the power grid (btw, it's called “Stromnetz” (electricity network) in German). The power grid has many “levels”, where each level is a network of cables that carries power at one specific voltage. For example, you might have a 220 kV level, then a 5 kV level, and a 230 V end-consumer level.

        Between these levels there have to be translations. Today these are transformers, converting higher-level AC into lower-level AC or the other way around; for AC networks they are basically a ring of iron and a few coils. For DC networks, however, other converters exist, such as buck/boost converters.

        My question basically is: can anyone give me experimental data on how well DC networks work in practice? Personal experience is enough; it doesn't have to be super-detailed reports.

        • SomeoneSomewhere@lemmy.nz · 4 months ago

          I’m not sure there are any power grids past the tens-of-megawatt range that aren’t just a 2/3/4 terminal HVDC link.

          Railway DC supplies usually just have fat rectifiers and transformers from the AC mains to supply fault current/clearing and stability.

          Ships are where I would expect to start seeing them arrive, or aircraft.

          Almost all land-based standalone DC networks (again, not few-terminal HVDC links) are heavily battery backed and run at battery voltage - that’s not practical once you leave one property.

          I’m sure there are some pretty detailed reports and simulations, though. A reduction in cost of multi-kV converters and DC circuit breakers is essential.

    • gandalf_der_12te@lemmy.blahaj.zoneOP · 3 months ago

      Thank you for this well-thought-out and balanced viewpoint. It took me 19 days to process all the information.

      So basically, I was wrong when I assumed that inverters had an efficiency of around 50%. That misunderstanding came from the phrase that “filters in the inverter eliminate high-frequency components in the PWM's output”. I thought they discard that power, but that's apparently not the case. So the efficiency is more like >95%. That's good.

      • SomeoneSomewhere@lemmy.nz · 3 months ago

        Even 95% is on the low side. Most residential-grade PV grid-tie inverters are listed as something like 97.5%. Higher voltage versions tend to do better.

        Yeah, filters essentially store power during one part of the cycle and release it during another. Net power lost is fairly minimal, though not zero. DC needs filtering too: all those switchmode power supplies are very choppy.

      • lemming741@lemmy.world · 4 months ago

        Is a square wave not AC? Current is flowing in and out of an inductor 100k times a second.

        Could that 100 kHz square wave excite a transformer and produce usable current on the secondary? Absolutely it could, and that's how a bigger SMPS works.

        If you’re looking for a “pure DC to pure DC” converter, that’s called a linear regulator and it’s wildly inefficient. They work by varying the conductance of a transistor but are useful for low currents. The extra voltage is converted to heat.
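
        To put a number on “wildly inefficient” (my own example values): a linear regulator passes the load current straight through and burns the excess voltage as heat, so its efficiency is roughly V_out / V_in.

        ```python
        V_in, V_out, I_load = 19.0, 5.0, 2.0   # assumed example: 19 V supply down to 5 V

        p_out = V_out * I_load                 # 10 W delivered to the load
        p_in = V_in * I_load                   # 38 W drawn (input current ~ output current)
        print(f"linear regulator: {p_out / p_in:.0%} efficient, {p_in - p_out:.0f} W of heat")
        # ~26% efficient here; a buck converter doing the same job at a typical
        # ~90-95% (assumed figure) would only dissipate around half a watt to a watt.
        ```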

        • nixcamic@lemmy.world · 4 months ago

          It’s more pulsed than alternating IMO. It never goes negative, and there isn’t a consistent frequency.

          • lemming741@lemmy.world · 4 months ago

            It depends where you measure. If you measure across the inductor, it absolutely goes negative.

            The frequency is generally fixed, the duty cycle will vary.

            A variable speed drive can be fed with DC. Is the output AC or DC? I know you need a three phase AC motor to wire up to it.

            Is audio DC? It doesn’t have a fixed frequency. Amplifiers pulse DC and then remove or ‘block’ the DC offset so speakers see AC.

            It seems like people in this thread have a very strict definition of AC being a 60Hz sine wave, and everything else must be DC.

  • NeoNachtwaechter@lemmy.world · 4 months ago

    The grid does not work at 230 V.

    It works from 10 kV up to hundreds of kV. Most of your arguments do not apply there.

    DC is good inside the house, and maybe to the next house. If I were building a new house today, I would run extra wires everywhere for AC and for 24 V and 5 V DC.

  • barsoap@lemm.ee · 4 months ago

    When you have one big producer (one big hydro-electric dam or coal power plant), then stabilizing the frequency is trivial, because you only have to talk to yourself.

    Your frequency is still influenced by a million and more consumers. And that's before reactive power comes into play.

    When you have 100000 small producers (assume everyone in a bigger area has photovoltaics on their roof), then suddenly stabilizing the frequency becomes more challenging, because everybody has to work in exactly the same rhythm.

    …and? Everyone is getting the current frequency via the network, everyone knows what it should be, and everyone can do their own small part in speeding it up or slowing it down.

    The actual issue is that huge turbines have lots of rotating inertia, which naturally stabilises the frequency. On the flip side, inverters (like with solar panels) can regulate the frequency actively; what's iffy is smaller AC generators like wind mills. But then there's also battery and capacitor banks.
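
    To make the inertia point concrete (classic swing-equation estimate; the inertia constants and the size of the deficit are assumed numbers): the more spinning mass is online, the slower the frequency falls after losing generation, and the more time inverters, batteries, and load shedding have to react.

    ```python
    f0 = 50.0    # Hz nominal frequency
    dP = 0.05    # per-unit: a sudden 5% generation deficit (assumed)

    for H in (5.0, 1.0):               # aggregate inertia constant in seconds
        rocof = f0 * dP / (2 * H)      # initial rate of change of frequency
        print(f"H = {H} s -> frequency falls at ~{rocof:.2f} Hz/s")
    # H = 5 s (lots of big machines) -> 0.25 Hz/s; H = 1 s (inverter-heavy grid
    # without synthetic inertia) -> 1.25 Hz/s.
    ```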

    It's a thing network engineers have to worry about, but it's not some insurmountable problem. We're already doing it. Insular networks have been doing it for ages; e.g. in Germany, Berlin's network wasn't part of the eastern one, and they always used stuff like capacitor banks to stabilise it.


    All that aside: yes, in the future there's probably going to be a high-voltage DC network in Europe. Less so for private consumers, at least not in the foreseeable future, but to connect up large DC consumers, that is, industry, with DC power sources. If you're smelting aluminium with solar power, going via AC is just pure conversion loss.

    • leds@feddit.dk · 4 months ago

      what’s iffy is smaller AC generators like wind mills

      Not so iffy for bigger wind turbines: these also have significant inertia due to the mass of the spinning rotor (a large moment of inertia), and grid codes demand active grid stabilisation in most countries.

    • gandalf_der_12te@lemmy.blahaj.zoneOP · 4 months ago

      All that aside: yes, in the future there's probably going to be a high-voltage DC network in Europe. Less so for private consumers, at least not in the foreseeable future, but to connect up large DC consumers, that is, industry, with DC power sources. If you're smelting aluminium with solar power, going via AC is just pure conversion loss.

      Thank you, that was exactly what I was looking for. I know about aluminum production processes, and that they require large amounts of DC power.

  • deegeese@sopuli.xyz · 4 months ago

    How does the efficiency and cost of buck converters compare to AC transformers? Seems like the cost and efficiency of the voltage converter should be the prime determinant, rather than specific applications of generation/consumption.

    What would a 400A 10kV utility scale DC converter look like?

    • gandalf_der_12te@lemmy.blahaj.zoneOP · 4 months ago

      Well, a large part of why I asked the question is that I hope somebody knows more about what buck/boost converters can do today. I know they work well enough at small scales, but I have no experimental data for them at larger scales.

      I assume they would work well, but I'd like somebody to link me to the right datasheet or something.

      Edit: you have a very important point there: “Seems like the […] voltage converter should be the prime determinant, rather than specific applications of generation/consumption.” YES. So, let me rephrase my question: does anybody have experience with high-power DC voltage converters?

  • aaaaace@lemmy.blahaj.zone · 4 months ago

    I lived in a house that had live 120 V DC service.

    There was an electric fan that ran on it. The outlets were only in the basement and identical to each other.

  • not_woody_shaw@lemmy.world · 4 months ago

    Leave the grid out of it. Your PVs charge your battery with DC, which gets used as DC by the USB-PD outlets in your wall. And you have an AC-DC converter for when you need to consume grid power.

  • TimeSquirrel@kbin.melroy.org · 4 months ago

    When you have 100000 small producers (assume everyone in a bigger area has photovoltaics on their roof), then suddenly stabilizing the frequency becomes more challenging, because everybody has to work in exactly the same rhythm.

    That's why you have standards and codes that ensure everybody's equipment is capable of syncing to the grid properly before it is allowed to connect. It's not that hard for an inverter to do. Then you have the constant background supply to stabilize it, like battery farms and other energy storage technologies, and a bunch of capacitor banks to correct power factor issues.

    I don't think we are getting away from centralized production anytime soon, even with the move to wind and solar, although I think nuclear should be included in that mix.

  • Petter1@lemm.ee · 4 months ago

    I learned in school that, these days, DC would in theory indeed be more efficient, even if you had to generate a sine wave at home for what would then be legacy devices.

  • Phoenix3875@lemmy.world · 4 months ago

    Power grids mean long-distance power transmission, so AC has an advantage there. If the point of consumption is near the point of PV generation, DC can be, and already is, being used.

    I know of factories with solar panels on their rooftops to cut down power bills; instead of converting to high-voltage AC, a custom-built DC power system is used.

      • Phoenix3875@lemmy.world · 4 months ago

        Sorry, I wasn't being clear. AC is used for connecting within areas of densely populated cities, e.g. the British National Grid. If we are talking about really long distances (hundreds of kilometers or more), HVDC is indeed preferred.

        I was talking about a trend of some factories replacing AC from the power grid (possibly generated in nearby cities) with DC from solar panels on their own rooftops. So the grid power comes from a long distance away compared to that.

        • explore_broaden@midwest.social · 4 months ago

          I don't disagree that it is used, I just don't think it really has an advantage in the modern day. However, switching would be extremely difficult, so the historically dominant AC continues.