
Posts: 1, Comments: 118, Joined: 2 yr. ago

  • Four years is not very long in a climate poised to undergo a phase transition unless we are able to reverse course very sharply. The IPCC is clear on this.

  • There's some compelling evidence it's also incompatible with economic growth despite this being the stated goal.

  • Yes. Of course wider changes like WFH and the pandemic have affected things, but specific policy choices have supported and reinforced those trends in order to produce better air-quality outcomes. Both trends are present in lots of cities without this drastic impact.

  • If he retains an ownership stake doesn't that mean he still takes home the profit of Brewdog so boycotting would still be effective/desirable?

  • Agreed that the studios need to be held more accountable and their usage of AI is more problematic than open source last resort type work. I have noticed a degradation of quality in the last five years on mainstream sources.

    However, the existence of this last-resort tool will shift the dynamics of the "market" for the work that should be being done, even in the open-source community. There used to be an active community of people giving their voluntary labour to writing subtitles for those that lacked them (they may still be active, I don't know). Are they as likely to do that if they think "oh well, it can be done automatically now"?

    The real challenge with the argument that it helps editors is the same as the challenge for automated driving. If something performs at 95%, you end up deskilling people and stepping down their attention, making it more likely they miss the 5% that requires manual intervention. I think it also has a material impact on the wellbeing of those doing the labour.

    To be clear, I'm not against this at all, but I think we need to think carefully about the structures and processes around it to ensure it leads to an improvement in quality, not just an improvement in quantity at the cost of quality.

  • It is probably good that the open-source community is exploring this. However, I'm not sure the technology is ready (or ever will be, maybe), and it potentially undermines the labour-intensive activity of producing high-quality subtitling for accessibility.

    I use them quite a lot and I've noticed they really struggle on key things like regional/national dialects, subject-specific words, and situations where context would allow improvement (e.g. a word invented solely in the universe of the media). So it's probably managing 95% accuracy, which is that danger zone where it's good enough that no one checks it but bad enough that it can be really confusing if you are reliant on them. If we care about accessibility, we need to care about it being high quality.

  • Yes, fair points. I assumed it was a balance between aerodynamics and crumple zones/legal requirements, which is why they don't all look like the Aptera (or Schlörwagens).

    I'm quite sure the system isn't optimising for what we want/need out of vehicles though and we could almost certainly do better.

  • They are similar, as far as I understand, because they all want the same outcomes from the design: better aerodynamics and effective crumple zones to facilitate higher survival of the occupants in a crash (some vehicles additionally try to limit injuries to pedestrians too, but less so in US vehicles).

    I do agree that we have lost some of the majesty of older variations of designs, but largely I think it's convergent evolution. To leave that behind you'd want a really good reason, which I don't think the Cybertruck really has. Different for the sake of being different rather than innovative.

  • What I find frustrating about the current wave of "AI" is how much it obfuscates any meaningful discussion about the utility of different methods and approaches.

    Does Machine Learning or Machine Vision have a role in decarbonisation? Probably yes, but it will require thought (and carbon accounting to make sure the savings are large enough!).

    Do LLMs or other GenAI techniques designed to pump out rehashes of existing images or text at tremendous energy cost have a role? No.

    Are either of them "Artificial Intelligence" or are either of them likely to become "Artificial Intelligence"? No.

  • I read something about how the best outputs are done using a blend of make-up/models with CGI adding the layer of realism on top, so pure CGI is worse, but film studios pursue it because it's cheaper and more outsourceable than heavily unionised make-up/prop workers.

  • Not sure how serious this comment is, but these are anomalies against expected behaviour from models. These models include historical data plus how we expect the changes we are making to impact it, using the best knowledge we have of how the systems work.

    So it's not saying it's surprising that Australia is hot this time of year; it's saying it's markedly hotter than we expect or can explain using everything we understand about the climate.
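    The distinction above (hot vs. hotter-than-the-model-expects) can be made concrete in a tiny sketch. All numbers here are invented for illustration; a real climate model is vastly more complex than a seasonal norm plus a linear trend.

```python
# Toy sketch (not a real climate model): an "anomaly" is measured against
# what a model *expects*, not against "is it hot right now?".
# All numbers below are invented purely for illustration.

seasonal_norm = 30.0   # hypothetical expected summer temperature (degC)
warming_trend = 0.02   # hypothetical expected warming per year (degC/yr)
years_elapsed = 50

# The model's expectation already includes both seasonality and the
# warming we know we've caused.
expected = seasonal_norm + warming_trend * years_elapsed

observed = 33.5        # hypothetical reading

# The anomaly is what remains *after* accounting for everything we expect.
anomaly = observed - expected
print(f"expected {expected:.1f}degC, observed {observed:.1f}degC, "
      f"anomaly {anomaly:+.1f}degC")
```

    So a +2.5°C anomaly in this sketch isn't "summer is hot"; it's "hotter than we can explain even after including the warming we already knew about".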

  • Love this!

  • "Enjoys" is not how I would describe it.

  • Yes, I agree that the headline and article are silly to reference memes and undermine the study as a whole, which seems more sound.

    I know loads of people who take hundreds of photos a day and then pay a cloud hoster (or use a "free" service) to store them indefinitely and never look at them again.

    Cloud storage isn't straightforwardly just hard storage, because it's kept in data centers such that it can be downloaded at any point.

    Cloud storage is replacing any sense of needing digital-archivist processes for people and businesses, because it's much cheaper and easier to store everything just in case the data is needed again rather than actually strategically thinking about what data is important to keep and what isn't.

  • Very much so. We aren't winning until the taps are turned off.

  • I'm sure it's small - "AI" is an unnecessary waste of resources when we can ill afford it. That said, we have actual quantifiable targets (that are so tough because we've left it so late) for energy and emissions, so it might still be the case that this also needs to change.

    Sadly, one of the things I hear quite a lot from people is the assumption that digital means it has no impact at all, and they act according to that assumption, but when you add it up it is having a sizeable impact.

  • This is a consistent misunderstanding I wish people would get past.

    Manufacturing things creates emissions. It costs energy and materials. Something could have absolutely no emissions in usage and still be problematic when done at growing scales, because the manufacturing costs energy, emissions, and resources. Hard drives wear out, die, and need replacing. Researchers know how to account for this: it's a life-cycle assessment calculation. These aren't perfect, but this is robust work.

    IT is up to 4% of global emissions and the sector is growing. People consistently act as if there is no footprint to digital media and there is. https://www.sciencedirect.com/science/article/pii/S2666389921001884

    Yes, the headline is a little silly, but we actually do need to think strategically about the sector, and that starts by realising it has an impact and asking ourselves what the priorities are that we want to save whilst we decarbonise the industry that supports it.

    There's no wiggle room left - no sector or set of behaviours that can afford to be given slack. We are in the biggest race of our lives and the stakes are incomprehensibly huge.
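    The life-cycle point above is just arithmetic over a replacement cycle. A minimal sketch, with entirely invented illustrative numbers (real LCA figures vary by device, grid, and methodology):

```python
# Minimal life-cycle sketch: even a device with modest use-phase emissions
# accumulates embodied (manufacturing) emissions every time it is replaced.
# All figures are hypothetical, for illustration only.

embodied_kgco2e_per_drive = 30.0   # hypothetical manufacturing footprint
use_kgco2e_per_year = 5.0          # hypothetical operating footprint
drive_lifetime_years = 5
horizon_years = 20

# Drives wear out, so over 20 years we buy 4 of them.
replacements = horizon_years // drive_lifetime_years

total = (replacements * embodied_kgco2e_per_drive
         + horizon_years * use_kgco2e_per_year)
embodied_share = replacements * embodied_kgco2e_per_drive / total

print(f"total {total:.0f} kgCO2e over {horizon_years} years, "
      f"{embodied_share:.0%} from manufacturing")
```

    With these toy numbers, over half the footprint comes from manufacturing, which never shows up if you only count the electricity the drive draws.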

  • The answers to your questions: yes, it's a different baseline to the one chosen by the Paris Agreement; different baselines are chosen as relevant to different elements of the issue. Likely the baseline in your link comes down to what reliable data they have, so they chose a baseline from a region of data they have rather than going to other sources. This website provides the latest year's official record in Paris terms; I would expect the next one (2024) to be much closer to 1.5°C.

    On (2), I agree that current measurements suggest an instantaneous/yearly temperature around 1.5°C against the relevant baseline.

    On (3), you are right that the trend is unlikely to change, because it comes from radiative forcing (emissions) that has already occurred, so even with sudden zero human emissions we would see an increase or, best case, a levelling (before maybe, long term, it can decline as CO2 is naturally removed from the atmosphere, or faster if humans find a way of doing so at scale). A trend, however, is already an average of several time points, and you can see in the link you shared that year-on-year variation in that number can be as high as ~0.3°C. This comes about from non-GHG forcing elements of the system (such as El Niño) that add natural variation. So already you could see 2019 onwards dropped by 0.2°C even though the trend is up. So you could expect us to potentially drop back down to say 1.2°C for a few years before it goes up again. The link above suggests that, on the best data we have, we would likely breach 1.5°C by 2031 - not long at all.

    This sounds like a pedantic point, but it's actually quite important for the climate, and the confusion stems back to how the problem and climate science were chosen to be communicated. Temperature was chosen in part because it's a proxy variable for other parts of the system that actually control the impacts, and it was felt that temperature would be "naturally understandable" by the general population (and politicians...). This backfired a bit, because 1.5°C is not a lot of difference when considered in, say, a room, and it highlights why this variable is different and why it matters that it's a decadal average rather than a yearly one.

    So if temperature is only a proxy, what are the variables that control the outputs? One key one is the total heat energy stored in different Earth systems, and there the size of the storage medium matters (the reason 1.5°C on the world is a lot, but on a room isn't, is that because of the sheer volume of the Earth you have to add a huge amount more energy). The other place where surface temperature adds confusion and complexity is the oceans: the oceans have been absorbing some of the heat, and that hasn't always been visible to us (as we don't live in the ocean), so if we stopped emitting today the ocean may then deposit some of that heat energy back into the atmosphere - it's a complex interaction.

    What we really need to know is what the additional level of radiative forcing is and how much additional heat energy is swimming about in Earth's systems - that is what will control the experience we have of the climate. Greenhouse gases act to stop Earth cooling back down by radiating out to space, which is why the effect is cumulative, so the difference between a sustained year-on-year 1.5°C and something that averages less but has a few years of 1.5°C is quite high, because they result in different amounts of total energy in the system.

    So, the short answer is that the Paris Agreement targets are set on the basis of what a decadal rise of 1.5°C by 2100 (i.e. the average over 2090-2100) means in terms of the excess heat energy and radiative forcing in the system. The limit itself is somewhat arbitrary, driven in part by the fact that we were at ~1°C when it was agreed and 2°C seemed like a reasonable estimate of something we might be able to limit it to. The origin of 1.5°C rather than 2°C is actually quite interesting and highlights a lot about how climate change policy has been decided, but this post is long enough.

    This is a good point. The sheer apocalyptic magnitude of the problem means that every tiny amount of change matters. Billions will die. There probably isn't a way to prevent that completely anymore. But if we can tick things down by a fraction and save a few hundred thousand people, or preserve a species of food crop that would have gone extinct - IDK what the exact outcomes are, but the point is tiny changes will have a massive impact, and they're important even if the situation is dire.

    Agreed, I think this is the right way of thinking about it. The risk of having communicated it to the world as a binary 1.5°C/2°C target is that people completely switch off if/when we finally confirm we've breached it, when the reality is it should embolden us further, not demoralise us. This is my number one concern at the moment. I would also add that what we are doing is "pushing" a system away from its natural equilibrium, and if we push hard enough we might find changes in the system itself which are very hard or impossible to undo. So it's more than just "more increase, more damage"; it's also about risks of fundamentally and permanently changing the system.

    As an analogy, think of a ball in the well of a local minimum that we push back and forth. If we hit it hard enough, rather than come back it goes and finds another minimum, which is just a whole different system than we are used to. These are sometimes called tipping points, and the frustrating thing about the complexity of the systems is that we don't and can't know for sure where those points are (although we do know the risks increase heavily as you move upwards past 1.5°C). They are by definition hard to model, because models are built up from prior experience (data) and these are in part unprecedented changes in the atmospheric record.

    I haven't mentioned "negative emissions" technologies, but it is worth saying that in principle we could do significant negative emissions, and that might mean we end up at 1.5°C in 2100 whilst having had a period of time above it - but negative emissions technologies could be a whole other rant. Worth noting, though, that lots of the pathways that show we could just about keep to 1.5°C do rely on negative emissions to different degrees (though the pathways are also limited in how much they think we might be able to push our economic systems).
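    The yearly-versus-decadal distinction running through this thread is easy to sketch numerically. The anomaly values below are invented for illustration, not real observations:

```python
# Sketch of why one 1.5 degC year isn't "reaching 1.5 degC" in the Paris
# sense: the target refers to a multi-decade average, which smooths out
# El Nino-style year-to-year swings.
# Yearly anomaly values below are invented for illustration.

anomalies = [1.1, 1.2, 1.0, 1.3, 1.2, 1.5, 1.2, 1.3, 1.4, 1.3]  # degC

hottest_year = max(anomalies)
decadal_mean = sum(anomalies) / len(anomalies)

print(f"hottest single year: {hottest_year:.2f} degC")
print(f"decadal average:     {decadal_mean:.2f} degC")
# A single 1.5 degC year can sit inside a decade that averages
# well under 1.5 degC.
```

    In this toy decade, one year touches 1.5°C while the decadal average is 1.25°C, which is why a record-hot year doesn't by itself mean the Paris threshold has been breached.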

  • I see this misconception a lot and it's really unfortunate. We aren't at what climate scientists call 1.5°C. Being at 1.5°C means the average anomaly being over 1.5°C for a period of decades. It isn't just a case of scientists being cautious; it's a completely different impact on the climate. It implies different amounts of impacts and different levels of heat energy in the whole system.

    Yes, we have hit 1.5°C over the last 12 months, partly down to El Niño, which is expected to subside shortly. There is some discussion about whether this year was an expected random anomaly or whether it suggests some feedback loop that's been underestimated, but we can't know until enough time has passed (maybe a year).

    All that just means both that the impacts we are already seeing are less bad than you'd expect at a long-term 1.5°C (and therefore we should be extremely worried), but also that we have factored that into our estimates of what outcomes are possible (though the 1.5°C window is increasingly narrow because, as you say, we still have our foot on the gas). So there is still time to make an impact, and every fraction of a degree and kg of CO2 matters.