
  • This take gets so close to a nuanced view, then doesn't quite make it there.

    TL;DR: Generative AI is just a tool. Authorship is still fundamental to Artistry. Artists will use AI to author great works, but most users will not. There is no shortcut to previsualization and taste.

    The core point is that AI image generation is a tool, a tool that is getting more tool-like every day. To use his analogy about the Magic Lasso: it was a tool that made one step in the image-editing process faster and easier when it worked, but many people using Photoshop might never even touch it because they don't need it or haven't learned its use case.

    Most people today using AI image generation do not know what ControlNet is, or how to successfully create a repeatable reference for a character to be generated.

    While the generalist application of the core technology, "text to image," is easy for the layperson to fool around with to generate images they haven't pre-visualized, it is an entirely different process to pre-visualize and then pursue the specific execution of an image. This is artistry: not putting a brush to paper, but creating an idea and then executing it to the predetermined specification.

    As a necessary aside: yes, there is artwork which is "spontaneous," "generative," or "process-driven," but in those cases the artwork is framed around the process and/or the concept. The story of its creation is fundamental to its value. "So I wrote a prompt and generated the image" isn't going to command much clout in the Fine Art world.

    To go back to the definition of an artist as one who conceives of and then executes artwork, and how this relates to the use of specialized AI tools, we need to discuss curation and producing.

    A curator is one who does not make art, but instead relates works of art to one another and exhibits them in context to perceivers of that art. They do not fundamentally create the value of the art, but can add to it through discussion, illumination, and exhibition. Much of the relation of individuals to artwork on the internet is mediated through the exchange of curation; that is to say, we curate what we like when we repost the works of others and comment on them, and we most often consume art online in this context of being curated by our extended social network.

    Contrast this with the role of the Producer in cinema: the producer's role is to curate talent to achieve a cohesive and successful work of art. They are almost never the Artist themselves, except in specific cases of Auteurs which I will touch on later; rather, it is their job to match Artists up with each other to assemble a team capable of executing an effective synthesis in the final work. The Director works the Actors into the shape of the performance they have conceived, the Production Designer creates the physical or digital space in which the performances take place, etc.

    Is your general member of the public typing prompts into an image generator more of an Artist, a Curator, or a Producer? To me there is no one answer; instead, each person takes the approach of one of these by way of their process.

    Now we come to Auteurship. In cinema, when there is one artist who has fully conceived of the final work of a film in every detail, and takes on the role of final decision maker in every aspect of the artwork, taking authorship from the others working on the film and placing it solely on themselves, we call them an Auteur. The Auteur is usually the Director, but very occasionally might be the Producer, or even the Writer. In all cases they are the one who has created ahead of time, in their mind's eye, the specific aesthetic, pacing, story, performance, and even score of the film, and who strives to bring the reality of what is being created as close as possible to that vision; everyone else on the production is then placed in the role of technician, or perhaps craftsperson. The Auteur is the Artist, and the film is their work.

    The last thing to say about the Auteur is this: there are many good Directors, Producers, and Writers, but very few good Auteurs.

    This is the case for AI: as the tools are developed to create specific, predictable outcomes, the burden of technical execution diminishes, but the burden of Authorship, of Artistry, cannot be diminished.

    Generative AI is just a tool, but it is vision and storytelling that will enable its output to be Art and recognized as such by Curators and the public, and as with every other art, the process will be fundamental to its value, and that process will not be a single offhand written prompt into one generalized algorithm.

  • He did; he talked about generating backgrounds for works so you can focus on illustrating the subject. It was a brief touch on the subject, but he acknowledged that AI is a tool that can be used as part of a process rather than replacing the process entirely, and that those two things are different.

  • This is the entire goal of Palantir: predictive policing using mass surveillance data on citizens. Tie in digital ID and you have an even worse version of the CCP's Social Credit system.

  • The project just launched and is a software-first project. We won't see a Libre Phone available for a while yet.

  • If the owners primarily want to make money by taking a portion of revenue out as dividends or distributions, as a family business typically does, then stable revenue is in some ways more important than reinvesting in growth.

    If the ownership wants to make money by eventually selling their stake (shares or equity) in the company then growth is fundamental to the strategy.

  • Excellent article! His invention of clean-room procedures for environmental testing probably exceeds even his successful campaign to ban lead in gasoline in terms of positive impact on society. Clair Patterson is a true hero of science.

  • It's all just supper after all.

  • Killdozer 2: Super-Killdozer, the Wreckoning

  • Dangit, forgot to lead with TIL, ah well

  • Huh, interesting: a private, text-based internet protocol where everything runs server-side and all sends are encrypted by default. I see the appeal.

  • The biggest difference is that the Dot Com bubble was strongly focused on tech companies going public and pumping small cap stock prices up.

    The AI bubble, on the other hand, is almost entirely being built by private equity, with the largest players all privately held but with large-cap public companies holding substantial stakes. Rather than a bunch of small companies seeing their stock prices pumped to many multiples of their debut price and then falling to zero, we have large-cap companies bumping up their value substantially, but not by major multiples, while the actual value of the biggest players in AI is entirely speculative and can't be invested in by retail investors.

    This is all by design. The financiers of the AI boom are well aware that a public-stock rush into AI by retail investors would lead to massive speculation and an inevitable crash. Instead, with all the retail money going into large-cap stocks, they hope to capture that value and funnel the money into long-term gains by making sure those big companies have some stake in the "winning" private companies. When the first big AI companies go bust, they will be consolidated into their investor groups and harvested for innovation to transfer over to the winners.

    Overall this strategy seems sound for avoiding a major retail stock bust, but it isn't without its own risks. For example, if open-source AI ends up winning out and the biggest private players fall flat, they could become toxic assets and drag down the large-cap stocks, and thereby the indexes and index funds, in favor of leaner players. In the current landscape, that would mean Microsoft going down with OpenAI while Apple goes up; Apple is waiting on the sidelines with a huge cash war chest, ready to buy.

  • (The best) local LLMs are FOSS, though; if bias is introduced, it can be detected and the user base can shift away to another version, unlike centralized cloud LLMs that are private silos.

    I also don't think LLMs of any kind will fully replace search engines, but I do think they will be one of a suite of ML tools that will enable running efficient local (or distributed) indexing and search of the web.

  • What was controversial about joining Futo?

  • When OpenWrite says "publish to the open web, Gemini, or Mastodon" what does it mean by Gemini?

  • Chonky

  • What a healthy looking critter!

  • You're joking, right? "Making up answers" in the case of search results just means a dead link. If you get a good link 99% of the time and don't have to use an enshittified service, that's good enough for 99% of people. "Try again" is the worst-case scenario.

  • Also, you know what would make this all even worse? Laws requiring that people prove their identity in order to consume content or pull videos... just like age verification laws now being passed in several countries. What a coincidence.

  • Not to mention that the scraped indexes can and should be shared. Unfortunately, what OP is seeing may be a move to thwart this type of brute-force scraping. It might resolve into dynamically assigned addresses, where the URL of a given object is temporarily assigned and served only to a single IP address, or group of IP addresses, that requested it within a given timeframe, before being rotated out until the object is found in search again and reassigned a new URL, and so on. This is a frankly stupid use of resources, but it can effectively be used to prevent crowdsourced indexes from proliferating, and to punish IPs (or even MAC addresses or browser fingerprints) associated with downloading and reuploading videos, which almost certainly have steganographic fingerprints embedded identifying who the video was served to at the time it was downloaded.
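    To make the rotation idea concrete, here's a minimal Python sketch of the scheme as I'm describing it — this is purely illustrative (the class, method names, and TTL parameter are all hypothetical, not any real platform's API): each object gets a temporary tokenized URL bound to the set of client IPs that requested it within a time window, and once the window closes the old URL stops resolving and a fresh one is minted on the next request.

    ```python
    import secrets

    class RotatingURLMap:
        """Hypothetical sketch of per-window URL rotation for served objects."""

        def __init__(self, ttl_seconds: float = 300.0):
            self.ttl = ttl_seconds
            # object_id -> (url_token, allowed_ips, window_expiry_time)
            self._live: dict[str, tuple[str, set, float]] = {}

        def url_for(self, object_id: str, client_ip: str, now: float) -> str:
            """Return the object's current temporary URL, rotating if the window closed."""
            entry = self._live.get(object_id)
            if entry is None or now >= entry[2]:
                # First request, or window expired: mint a fresh random token.
                entry = (secrets.token_urlsafe(16), set(), now + self.ttl)
                self._live[object_id] = entry
            entry[1].add(client_ip)  # bind this client to the current window
            return f"/v/{entry[0]}"

        def authorize(self, object_id: str, url: str, client_ip: str, now: float) -> bool:
            """Serve only to IPs bound within the live window; stale URLs are dead."""
            entry = self._live.get(object_id)
            if entry is None or now >= entry[2]:
                return False  # URL already rotated out
            token, allowed, _ = entry
            return url == f"/v/{token}" and client_ip in allowed
    ```

    The effect is exactly what makes shared indexes rot: a scraped URL is only valid for whoever fetched it, and only until the window rotates, so a crowdsourced index fills up with dead links almost immediately.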