Anyone firing employees because they thought AI would do their jobs in 2025 should be fired. It really doesn’t take much research to see that AI isn’t at the place where it’s replacing people – yet. And business managers – particularly in small and mid-sized companies – who think it is had better think again.

At best, generative AI platforms are providing an enhanced version of search: instead of sifting through dozens of websites, lists and articles to figure out how to choose a great hotel in Costa Rica, fix a broken microwave oven or translate a phrase from Mandarin to English, we simply ask a chatbot a question and it provides the best answer it finds. These platforms are getting better and more accurate, and they are indeed useful tools for many of us.

But these chatbots are nowhere near replacing our employees.

It’s somewhat akin to claiming that now that we have hammers, carpenters aren’t needed.

  • Korhaka@sopuli.xyz · 9 points · 8 hours ago

    I have worked with people who could be replaced with a small bash script. It’s more a question of when and how many.

  • RickRussell_CA@beehaw.org · 9 points · 13 hours ago

    With respect to the article, it’s wrong. AI help desk is already a thing. Yes, it’s terrible, but human help desk was already terrible. Businesses are ABSOLUTELY cutting out tier 1 call center positions.

    LLMs are exceptionally good at language translation, which should be no surprise as that kind of statistical chaining is right up their alley. Translators are losing jobs. AI Contract analysis & legal blacklining are going to put a lot of junior employees and paralegals out of business.

    I am very much an AI skeptic, but I also recognize that people who do the things LLMs are already pretty good at are in real trouble. As AI tools get better at more stuff, that target list of jobs will grow.

  • P03 Locke@lemmy.dbzer0.com · 7 points · 17 hours ago

    Oh, geesh, where have I heard this pattern before with technology? Is it self-driving cars? Remember when the sky was falling, and everybody said car transportation was going to change overnight, because a bunch of fucking college students figured out how to win a self-driving car race? Where the hell is my fully-autonomous Level 5 self-driving car that was going to replace all of the truck driving jobs? Oh, what’s that? All of the jobs are still here?

    Technology moves the needle. It does not jam it all the way to the other side. Stop spending trillions of dollars on pipe dreams, only to have to force expectations back to reality ten years later.

    • Pete Hahnloser@beehaw.org (OP) · 4 points · 16 hours ago

      You can have your Level 5 car just as soon as we’ve tackled jet packs and fusion reactors. Priorities!

    • jarfil@beehaw.org · 4 points · 12 hours ago

      Whose LLMs?

      Content farms and SEO experts have been polluting search results for decades. Search LLMs have leveled the playing field: any trash a content farm LLM can spit out, a search LLM can filter out.

      Basically, this:

        • jarfil@beehaw.org · 1 point · 8 hours ago

          All of them. The moment they summarize results, it automatically filters out all the chaff. Doesn’t mean what’s left is necessarily true, just like publishing a paper doesn’t mean it wasn’t p-hacked, but all the boilerplate used for generating content and SEO, is gone.

          Starting with Google’s AI Overview, all the way to chatbots in “research” mode, or AI agents, they return the original “bulletpoint” that stuff was generated from.

          • VagueAnodyneComments@lemmy.blahaj.zone · 1 point · 2 hours ago

            The premise that AI enhances search is false. The stated barriers to “AI” adoption for small businesses are dated and false. The statement that LLMs and associated technology will become more accurate and reliable is false.

            The accurate statement in the article, that AI has no impact on earnings or hours, is from an outside source.

            So you see there is nothing of value provided by the article itself, because the article is propaganda designed to convince you that LLMs have a productive future and are presently useful for applications such as search. These are both lies. The article is lying.

        • jarfil@beehaw.org · 3 points · 11 hours ago

          Can you elaborate? It does match my personal experience, and I’ve been on both ends of the trash flinging.

    • sculd@beehaw.org · 12 points · 1 day ago

      Companies that care about quality cannot replace workers with LLM.

      Problem is some “executives” think they can save cost by “AI”, and they are trying.

      • Midnitte@beehaw.org · 12 points · 1 day ago

        They’re going to/are replacing workers - the problem is they’re going to make someone else do more work to check/fix the output.

        In the end, there won’t be any cost savings (or frankly, even any “productivity”) - just another tool companies pay for because every other company uses it.

        • sculd@beehaw.org · 8 points · 1 day ago

          That’s the problem. I am already seeing AI slop in my area of work, and it usually needs heavy cleanup. In the end, it’s not saving any time.

      • Ledericas@lemm.ee · 1 point · 9 hours ago

        AI can replace management; they’re about the least useful people in a company.

      • Dark Arc@social.packetloss.gg · 2 points · 12 hours ago

        > It almost certainly already has replaced several.

        Has it actually replaced them?

        Sure maybe some people have lost their jobs, but I don’t think they’ve really been replaced.

        It’s closer to laying someone off without replacement … because everything I’ve seen has suggested AI not only can’t do the work, it also doesn’t improve the productivity of workers using it in any meaningful way.

        > Also, AI is not synonymous with LLM.

        I’d argue that’s all AI means anymore if it has any meaning left at all.

        • Opinionhaver@feddit.uk · 1 point · 9 hours ago

          The term artificial intelligence is broader than many people realize. It doesn’t refer to a single technology or a specific capability, but rather to a category of systems designed to perform tasks that would normally require human intelligence. That includes everything from pattern recognition, language understanding, and problem-solving to more specific applications like recommendation engines or image generation.

          When people say something “isn’t real AI,” they’re often working from a very narrow or futuristic definition - usually something like human-level general intelligence or conscious reasoning. But that’s not how the term has been used in computer science or industry. A chess-playing algorithm, a spam filter, and a large language model can all fall under the AI umbrella. The boundaries of AI shift over time: what once seemed like cutting-edge intelligence often becomes mundane as we get used to it.

          So rather than being a misleading or purely marketing term, AI is just a broad label we’ve used for decades to describe machines that do things we associate with intelligent behavior. The key is to be specific about which kind of AI we’re talking about - like “machine learning,” “neural networks,” or “generative models” - rather than assuming there’s one single thing that AI is or isn’t.

        • jarfil@beehaw.org · 1 point · 11 hours ago

          A lot of people have been working tedious and repetitive “filler” jobs.

          • Computers replaced a lot of typists, drafters, copyists, calculators, filers, clerks, etc.
          • LLMs are replacing receptionists, secretaries, call center workers, translators, slop “artists”, etc.
          • AI Agents are in the process of replacing aides, intermediate administrative personnel, interns, assistants, analysts, spammers, salespeople, basic customer support, HR personnel, etc.

          In the near future, AI-controlled robots are going to start replacing low-skilled labor, then intermediate-skilled labor.

          “AI” has the meaning of machines replacing what used to require humans to perform. It’s a moving goalpost: once one is achieved, we call it an “algorithm” and move to the next one, and again, and again.

          Right now, LLMs are at the core of most AI, but AI has already moved past that, to “AI Agents”, which is a fancy way of saying “a loop of an LLM and some other tools”. There are already talks of moving past that too, the next goalpost.
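That “loop of an LLM and some other tools” can be sketched in a few lines. Everything below is a hypothetical illustration of the agent pattern – the llm() stub, the calculator tool, and the CALL/FINISH convention are invented for this sketch, not any real product’s API:

```python
def llm(prompt: str) -> str:
    """Stand-in for a model call: decides the next action from the history."""
    if "42" in prompt:                      # a prior tool result is visible
        return "FINISH: the answer is 42"
    return "CALL calculator: 6 * 7"

def calculator(expr: str) -> str:
    """A trivial 'tool' the agent can invoke."""
    a, op, b = expr.split()
    return str(int(a) * int(b)) if op == "*" else expr

TOOLS = {"calculator": calculator}

def agent(task: str, max_steps: int = 5) -> str:
    """The loop: ask the LLM, run the tool it picks, feed the result back."""
    history = task
    for _ in range(max_steps):
        action = llm(history)
        if action.startswith("FINISH:"):
            return action[len("FINISH:"):].strip()
        name, arg = action[len("CALL "):].split(": ", 1)
        result = TOOLS[name](arg)           # run the chosen tool
        history += f"\n{name}({arg}) -> {result}"
    return history

print(agent("What is 6 times 7?"))          # → the answer is 42
```

The “agent” here is nothing more than that loop; swapping the stub for a real model API and adding more tools is the whole upgrade path being described.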

    • Lucy :3@feddit.org · 7 points · 1 day ago

      Meanwhile companies keep pushing “AI” (as in, LLMs integrated with image/video generation, STT and TTS, networking, file generation and reading, etc.), as traditional, useful ML – built for one purpose and fulfilling that purpose well – sinks into irrelevancy.