Posts: 1 · Comments: 54 · Joined: 2 yr. ago

  • Think of how many family recipes could be preserved. Think of the stories that you can be retold in 10 years. Think of the little things that you’d easily forget as time passes.

    An AI isn't going to magically know these things, because these aren't AIs based on brain scans preserving the person's entire mind and memories. They can learn only the data they're told. And fortunately, there's a much cheaper way for someone to preserve family recipes and other memories that their loved ones would like to hold onto: they could write it down, or record a video. No AI needed.

  • Sure, you should be free to make one. But when you die and an AI company contacts all your grieving friends and family to offer them access to an AI based on you (for a low, low fee!), there are valid questions about whether that will cause them harm rather than help - and grieving people do not always make the most rational decisions. They can very easily be convinced that interacting with AI-you would be good for them, when it actually prolongs their grief and makes them feel worse. Grieving people are vulnerable, and I don't think AI companies should be free to prey on the vulnerable, which is a very, very realistic outcome of this technology. Because that is what companies do.

    So I think you need to ask yourself not whether you should have the right to make an AI version of yourself for those who survive your death... but whether you're comfortable with the very likely outcome that an abusive company will use their memories of you to exploit their grief and prolong their suffering. Do you want to do that to people you care about?

  • Just gonna say that I agree with you on this. Humans have evolved over millions of years to emotionally respond to their environment. There's certainly evidence that many of the mental health problems we see today, particularly at the scale we see, are in part due to the fact that we evolved to live in a very different way to our present lifestyles. And that's not about living in cities rather than caves, but more to do with the amount of work we do each day, the availability and accessibility of essential resources, the sense of community and connectedness with small social groups, and so on.

    We know that death has been a constant of our existence for as long as life has existed, so it logically follows that dealing with death and grief is something we've evolved to do. Namely, we evolved to grieve for a member of our "tribe", and then move on. We can't let go immediately, because we need to be able to maintain relationships across brief separations, but holding on forever to a relationship that can never be continued would make any creature unable to focus on the needs of the present and future.

    AI simulacra of the deceased give the illusion of maintaining the relationship with the deceased. It is entirely possible that this will prolong the grieving process artificially, when the natural cycle of grieving is to eventually reach a point of acceptance. I don't know for sure that's what would happen... but I would want to be absolutely sure it's not going to cause harm before unleashing this AI on the general public, particularly vulnerable people (which grieving people are).

    Although I say that about all AI, so maybe I'm biased by the ridiculous ideology that new technologies should be tested and regulated before vulnerable people are experimented on.

  • There may not have been any intentional design, but humans are still meant to eat food, drink water, and breathe oxygen, and going against that won't lead to a good end.

  • As many a person has said before when an outlandish and unproven claim is made: pics or it didn't happen.

  • I don't particularly want to jump between a dozen different apps to have access to every single tool and filter I use, especially when, even within a single file format (PSD), not every app treats layers in the same way. In a detailed digital painting, you can very easily have hundreds of layers, so it's absolutely a deal-breaker if your layer groupings or group masks are destroyed when switching between apps.

  • That would probably work for hobbyists, but I have my doubts that professionals, who rely on Adobe products for their livelihood, could use unsuitable software for years in the hopes that volunteer devs will eventually add the features they need. In the other post about this topic, someone commented that GIMP's devs are refusing to fix problems that are repelling new users, which is not going to encourage Adobe users to make the switch. GIMP still doesn't have fully functioning, reliable non-destructive editing, which is 100% essential for anyone beholden to a boss or client who is going to change their mind a couple of times between now and next month.

    Adobe is big because of their userbase, but their userbase is big because they make genuinely powerful software that fits the needs of professionals. The free options (and the cheap proprietary options) are not there yet, and probably never will be. Professionals aren't going to switch until the features they need are there (because seriously, why would anyone use a tool for their job that doesn't actually allow them to do their job properly?), but the features aren't going to be added until the professionals switch over. Catch-22.

  • Been a while since I used Krita, so it's hard to compare Krita from 3 or 4 years ago with Photoshop 2023, but it was okay. Better than GIMP, but unless there's been some major changes, it doesn't have anywhere near the versatility in tools and filters that Photoshop has.

    This feels like the key difference between Photoshop and the others. There's an awful lot of stuff that previously I would have to do manually, sometimes over several hours, that Photoshop can do in seconds, either because there's a tool or filter for it, or sometimes just because Photoshop is so much more responsive. This is really hard to quantify in an objective way, far more so than pointing out whether a feature is present or absent, but... I use an art tablet and Photoshop just responds to the pen better.

    So like it's not really that it's impossible to do amazing work with the free apps, it'll just take a lot longer. I liked your analogy in your other comment, about the e-bike vs pickup truck: you definitely can move that half a ton of crushed stone with an e-bike, but it'll be quicker and less work with a pickup truck.

  • The one thing I've been dissatisfied with Photoshop for, in comparison to another app, is that its traditional media analogues don't come even close to Painter's, and I've not been able to set up any brushes in a way that replicates them. There are professionals who use Painter in addition to Photoshop because of that, and I expect I will as well - but when I do, I really notice the missing features that I use a lot in Photoshop.

  • I have to agree. I've used a great many software packages over the years, but having been given an Adobe Creative Cloud subscription by my university, as several of Adobe's programs are required for the degree I'm doing, I've been very annoyed to discover that the alternatives really aren't on the same level. They work, sure. You can get the job done with them. But I am genuinely finding Photoshop to be significantly more powerful than everything else I've used. And it's really annoying because I've never liked Adobe as a company.

  • The "Willy Wonka Experience" event comes to mind. The images on the website were so obviously AI-generated, but people still coughed up £35 a ticket to take their kids to it, and were then angry that the "event" was an empty warehouse with a couple of plastic props and three actors trying to improvise because the script they'd been given was AI-generated gibberish. Straight up scam.

  • UK citizens can also opt out, as the Data Protection Act 2018 is the UK's implementation of GDPR and confers all of the same rights.

    In my opt-out, I have also reminded them of their obligation to delete data when I do not consent to its use: since I have denied consent, any of my data that has been used must be scrubbed from the training sets and from any AI outputs derived from the unauthorised use of my data.

    Sadly, having an Instagram account is unavoidable for me. Networking is an important part of many creatives' careers, and if the bulk of your connections are on Instagram, you have to be there too.

  • Well, let's see about the evidence, shall we? OpenAI scraped a vast quantity of content from the internet without consent or compensation to the people that created the content, and leaving aside any conversations about whether copyright should exist or not, if your company cannot make a profit without relying on labour you haven't paid for, that's exploitation.

    And then, even though it was obvious from the very beginning that AI could very easily be used for nefarious purposes, they released it to the general public with guardrails that were incredibly flimsy and easily circumvented.

    This is a technology that required being handled with care. Instead, its lead proponents are of the "move fast and break things" mentality, when the list of things that can be broken is vast and includes millions of very real human beings.

    You know who else thinks humans are basically disposable as long as he gets what he wants? Putin.

    So yeah, the people running OpenAI and all the other AI companies are no better than Putin. None of them care who gets hurt as long as they get what they want.

  • Had OpenAI not released ChatGPT, making it available to everyone (including Russia), there are no indications that Russia would have developed their own ChatGPT. Literally nobody has made any suggestion that Russia was within a hair's breadth of inventing AI and so OpenAI had better do it first. But there have been plenty of people making the entirely valid point that OpenAI rushed to release this thing before it was ready and before the consequences had been considered.

    So effectively, what OpenAI have done is start handing out guns to everyone, and is now saying "look, all these bad people have guns! The only solution is everyone who doesn't already have a gun should get one right now, preferably from us!"

  • AI programs are already dominated by bad actors, and always will be. OpenAI and the other corporations are every bit as much bad actors as Russia and China. The difference between Putin and most techbros is as narrow as a sheet of paper. Both put themselves before the planet and everyone else living on it. Both are sociopathic narcissists who take, take, take, and rely on the exploitation of those poorer and weaker than themselves in order to hoard wealth and power they don't deserve.

  • The metaphoric argument is exactly on point, though: the answer to "bad actors will use it for evil" is not "so everybody should have unrestricted access to this really dangerous thing." Sorry, but in no situation you can possibly devise is giving everyone access to a dangerous tool the correct answer to bad people having access to it.

  • And probably also that water is wet and fire is hot?

  • Just don't complain when the world becomes even more shit than it already is. Open source AIs that rely on scraping content without paying the creator are just as exploitative of workers as corporate AIs doing the exact same thing.

  • It really is. I'm also not a huge fan of "everyone needs to have access to their own personal open source AI, otherwise only corporations will be able to use it", like somehow the answer to corporations being shit is to give everyone else a greater ability to be shit too. What the world really needs is even more shit!