• 1984@lemmy.today (OP) · ↑10 ↓1 · 1 year ago

      Yeah, I didn’t really read the article that way. What is the definition of “liberal bias”?

      • mo_ztt ✅@lemmy.world · ↑5 ↓1 · 1 year ago

        The title of the article is clickbait. The actual text says something I think is accurate: these bots produce answers through a pretty inscrutable process, and it’s very difficult to get them to behave any particular way (whether that’s to be “unbiased” politically, or accurate, or to refuse to do illegal things, or what have you).

  • CAPSLOCKFTW@lemmy.ml · ↑33 · 1 year ago

    You can’t have a chatbot that gives good answers without having a chatbot that gives liberal answers.

    • flambonkscious@sh.itjust.works · ↑2 · 1 year ago

      Good point… It’s not like tyrants like China or North Korea are creating AIs; they’re too busy running bot farms and hacking factories.

      The real AI work was done by the educated, who have been shown to be liberal more often than not.

  • neptune@dmv.social · ↑5 · 1 year ago

    Do we mean the rest of the planet is wrong for thinking the US “liberal” party is not very far left? Or…?

  • AuthorityClassError@lemm.ee · ↑3 · 1 year ago

    Yeah, there are certain topics where the bot suddenly switches into lecture mode, which is very obvious. I wouldn’t really call it a liberal bias, more of a programmer-culture bias. They obviously messed with the default product to push certain topics and avoid others. That’s why I consider it a useless product. When someone publishes the first LLM that is truly unbiased by nature, that’s certainly going to change some things.