One of the stupidest things I’ve heard.
I also would like to attach the bluesky repost I found this from:
https://bsky.app/profile/leyawn.bsky.social/post/3lnldekgtik27
I don’t see any reason why this can’t be discussed. I think people here are just extremely anti AI. It is almost like forcing AI on people was a bad idea.
I think there’s a useful discussion for why these technologies can be effective at getting people to connect with them emotionally, but they themselves don’t experience emotions any more than a fictional character in a book experiences emotion.
Our mental model of them can experience emotions, but the physical representation is just words. In the book I’m reading there was a brutal torture scene. I felt bad for the character, but if there were an actual being experiencing that kind of torment, making and reading the book would be horrendously unethical.
i don’t even understand why it’s worth discussing in the first place. “can autocomplete feel?” “should compilers form unions?” “should i let numpy rest on weekends?”
wake me up when what the marketers call “ai” becomes more than just matrix multiplication in a loop.
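For what it’s worth, the “matrix multiplication in a loop” quip can be sketched very loosely. This is a toy, not any real model: the weights are random, the names (`embed`, `w1`, `w2`, `next_token`) are made up for illustration, and real LLMs add attention, normalization, and sampling on top. But structurally, generation really is repeated matrix multiplies:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy "model": hypothetical random weights, purely illustrative
vocab, dim = 50, 8
embed = rng.normal(size=(vocab, dim))  # token embedding table
w1 = rng.normal(size=(dim, dim))       # hidden layer weights
w2 = rng.normal(size=(dim, vocab))     # output projection

def next_token(tokens):
    # embed the context, multiply through, pick the likeliest next token
    h = embed[tokens].mean(axis=0)     # crude pooling of the context
    h = np.maximum(h @ w1, 0)          # matmul + ReLU
    logits = h @ w2                    # matmul to vocabulary scores
    return int(np.argmax(logits))

# "autocomplete": matrix multiplication in a loop
seq = [1, 2, 3]
for _ in range(5):
    seq.append(next_token(seq))
```

Everything the system “does” is generate the next token from the previous ones; there is no state in there that corresponds to feeling anything.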
If it were a broad discussion of intelligence, then I could see it.
I do agree that we are nowhere close to anything that resembles actual intelligence.
Nobody forced it on anybody. My work uses gdocs; I just never turned Gemini on. Easy.