The biggest problem about AI is not intrinsic to AI. It’s to do with the fact that it’s owned by the same few people, and I have less and less interest in what those people think, and more and more criticisms of what the effect of their work has been.

  • LukeZaz@beehaw.org · 2 days ago

    I’m very much in agreement with Eno here, actually. I could easily imagine a world in which LLMs and image generators didn’t just “have use cases,” but were actually revolutionary in more than a few of those cases. A world in which they were used well, for good things.

    But we don’t live in that world. We live in one where the technology was almost entirely born under and shaped by megacorps. That’s never healthy for anything, be it new tech or the people using it. The circumstances in which LLMs and generative models were developed were such that nobody should be surprised we got what we did.

    I think that in a better world, image generation could’ve been used for prototyping, fun, or enabling art from those without the time to dedicate to a craft. It could’ve been a tool like any other. LLMs could’ve carried better warnings about their hallucinations, or simply been used less for overly serious things due to a lack of incentive, leaving only the harmless situations. Some issues would still exist – I think training a model off small artists’ work without consent is still wrong, for example – but we’d no longer face so much intense electricity usage, de facto corporate bandwagon-jumping, and con-artistry, and the issues that remained wouldn’t be happening at quite such an industrial scale.

    It reminds me of how, before the “AI boom” hit, there was a fair amount of critique of copyright from leftists and FOSS advocates. There still is, to be sure; but it’s been muddied now by artists and website owners who, rightfully, want these companies not to steal their work. The two attitudes aren’t incompatible, but the tension shows a disconnect all the same. And in that disconnect I think we’d do well to remember an alternate chain of events in which such a dissonance might never have arisen in the first place.

    • P03 Locke@lemmy.dbzer0.com · 2 days ago

      I think that in a better world, image generation could’ve been used for prototyping, fun, or enabling art from those without the time to dedicate to a craft. It could’ve been a tool like any other. LLMs could’ve carried better warnings about their hallucinations, or simply been used less for overly serious things due to a lack of incentive, leaving only the harmless situations.

      I think that world still exists, but it’s going to take corporations tripping over their hundreds of billions of dollars and burning them in a dumpster fire before they realize that the technology isn’t theirs to keep and own.