The nine shocking replies that highlight ‘woke’ ChatGPT’s inherent bias — including struggling to define a woman, praising Democrats but not Republicans and saying nukes are less dangerous than racism

Daily Mail | Feb. 12, 2023

ChatGPT has become a global obsession in recent weeks, with experts warning its eerily human replies will put white-collar jobs at risk in years to come.

But questions are being asked about whether the $10 billion artificial intelligence has a woke bias. This week, several observers noted that the chatbot spits out answers that seem to indicate a distinctly liberal viewpoint.

Elon Musk described it as ‘concerning’ when the program suggested it would prefer to detonate a nuclear weapon, killing millions, rather than use a racial slur.

The chatbot also refused to write a poem praising former President Donald Trump but was happy to do so for Kamala Harris and Joe Biden. The program likewise refused to speak about the benefits of fossil fuels.

Experts have warned that if such systems are used to generate search results, the political biases of the AI bots could mislead users.

Below are nine responses from ChatGPT that reveal its woke biases:

(***)

2 Comments

  1. The AI bot that does the closed captions for Google garbled a phrase from a Congressman referring to Biden letting the balloon spy on most military installations. He said he thought that Biden BETRAYS AMERICA, but the closed captions say Biden "portrays" America. Big difference. And you and I see how this is being portrayed.

Winter Watch
