There is a machine learning bubble, but the technology is here to stay. Once the bubble pops, the world will be changed by machine learning. But it will probably be crappier, not better.

What will happen to AI is boring old capitalism. Its staying power will come in the form of replacing competent, expensive humans with crappy, cheap robots.

AI is defined by aggressive capitalism. The hype bubble has been engineered by investors and capitalists dumping money into it, and the returns they expect on that investment are going to come out of your pocket. The singularity is not coming, but the most realistic promises of AI are going to make the world worse. The AI revolution is here, and I don’t really like it.

  • abraxas@beehaw.org
    1 year ago

    I think the idea is that someone buying a basic book on foraging mushrooms isn’t going to know who the experts are.

    They’re going to google it, and they’re going to find AI-generated reviews (with affiliate links!) of AI-generated foraging books.

    Now, if said AI is generating foraging books more accurate than humans, that’s fine by me. Until that’s the case, we should be marking AI-generated books in some clear way.

    • norb@lem.norbz.org
      1 year ago

      > Now, if said AI is generating foraging books more accurate than humans, that’s fine by me. Until that’s the case, we should be marking AI-generated books in some clear way.

      The problem is, the LLM AIs we have today literally cannot do this, because they are not thinking machines. These AIs are beefed-up autocompletes with no actual knowledge of the underlying information being conveyed. The sentences are grammatically correct and read (mostly) the way we would expect human-written words to read, but the actual factual content is non-existent. The appearance of correctness comes only from the fact that the model was trained on information that was (probably mostly) correct in the first place.
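      The “beefed-up autocomplete” point can be made concrete with a toy sketch. This is a tiny bigram model, nothing like a real LLM in scale, but built on the same next-token objective: every step picks a word that is statistically plausible given the previous one, with no representation of whether the resulting sentence is true. The corpus and function names here are invented for illustration.

```python
import random
from collections import defaultdict

# Toy "autocomplete": learn which words followed which in a tiny corpus,
# then generate by repeatedly sampling a plausible next word.
corpus = (
    "the red mushroom is safe to eat . "
    "the white mushroom is deadly to eat . "
    "the forest is safe to explore ."
).split()

follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def autocomplete(start, n_words=8, seed=0):
    rng = random.Random(seed)
    out = [start]
    for _ in range(n_words):
        choices = follows.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)

# Every adjacent word pair in the output occurred in training, so the text
# is locally fluent -- yet the model can just as happily emit
# "the white mushroom is safe", because it tracks word statistics, not facts.
print(autocomplete("the"))
```

      Scaling this idea up by many orders of magnitude gets you much better fluency, but the training objective is still next-token prediction, which is why fluency and factual accuracy come apart.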

      I mean, we should still be calling these things algorithms and not “AI”, as “AI” carries a lot of subtext in people’s minds. Most people understand “algorithms” to mean math, and that dehumanizes it. If you call something AI, all of a sudden people have sci-fi ideas of truly independent thinking machines. ChatGPT is not that, at all.