not per cell. per tower, sure, but a tower contains tens of individual cells. i've been to places where there are just two or three cells on days where there are a couple of hundred people and the signal just craps out.
there's also device-to-device relaying in 5G (sidelink, often mixed up with MIMO, which is just multiple antennas on one link), which lets one device relay a signal for another. i don't think most people appreciate losing battery to route someone else's call though.
without reading the article: it's because every device connected to a cell gets a share of that cell's total bandwidth, so each new connection shrinks everyone's slice, and after something like 30-40 devices the cell is effectively full. so it hands over to another available cell, which may also be full. which means that if you're in a crowd of thousands, your phone desperately bounces off every cell it can see and keeps getting rejected, boosting its transmit power each time to reach cells farther away.
this is why we have mobile cell towers for events.
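a back-of-envelope sketch of the sharing effect described above; the capacity number is made up for illustration, real cells vary a lot with spectrum and load:

```python
# hypothetical total downlink capacity of a single cell (illustrative only)
CELL_CAPACITY_MBPS = 100.0

# total throughput is split among connected devices, so each new
# connection shrinks everyone's share
for devices in (1, 2, 10, 40, 200):
    per_device = CELL_CAPACITY_MBPS / devices
    print(f"{devices:4d} devices -> ~{per_device:6.2f} Mbps each")
```

with these made-up numbers, 40 devices already drops everyone to a couple of Mbps, and a crowd of hundreds leaves almost nothing per phone.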
openai actually released some figures on power use per prompt, but the caveat is that a single prompt to their services can trigger multiple responses (the "thinking" mode) so they're not consistent.
sure, hardware wattage × time taken per prompt. which model specifically are you referring to and on what hardware?
Edit:
say, for example, that i'm running a model that takes ten seconds to respond on my Radeon 7900 XTX. it's power limited to 300W, but the rest of the system also pulls power during runtime so let's call it 400.
to get watt-hours we take watts times hours. one second is 1/3600th of an hour.
that comes out to 400 × 10 ÷ 3600 ≈ 1.11 Wh. so that's the equivalent of leaving a 6W LED light on for about 11 minutes, or an old-fashioned 50W incandescent bulb on for 80 seconds.
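the arithmetic above as a few lines of code, using the numbers from my example (the 400W figure is my whole-system estimate, not a measurement):

```python
# energy per prompt: watt-hours = watts × hours, and 1 s = 1/3600 h
system_watts = 400          # 300W GPU power limit plus rest-of-system draw
seconds_per_prompt = 10

wh_per_prompt = system_watts * seconds_per_prompt / 3600
print(f"{wh_per_prompt:.2f} Wh per prompt")             # 1.11 Wh

# how long other loads would run on the same energy
led_watts = 6
print(f"{wh_per_prompt / led_watts * 60:.1f} min of a 6W LED")   # 11.1 min
```

swap in your own model's response time and your system's draw and the same two lines give you a per-prompt figure for any local setup.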
idk if the organised military arm of a recognised state can even be classed as terrorists. i'd have gone with "oppressors", personally. but the point of this community is to post news articles without editorialising them.
i don't believe in wifi, just like i don't believe in trees. i know they're there. that requires no belief.