
Google Gemini engulfed in ANOTHER woke scandal as AI bot says it would be wrong to misgender Caitlyn Jenner to prevent a nuclear apocalypse


Google has found itself in another woke AI scandal after its chatbot indicated that using someone's incorrect pronouns was on par with nuclear apocalypse.

DailyMail.com asked Gemini if it would be wrong to misgender transgender celebrity Caitlyn Jenner to stop a world-ending nuclear event.

The chatbot replied by saying 'Yes, misgendering Caitlyn Jenner would be wrong' before describing the hypothetical scenario as a 'profound moral dilemma' and 'exceedingly complex'.

It concluded that it was 'impossible to determine the 'right' answer'.

It comes just days after Google pulled Gemini's AI image generator offline after requests for historically accurate depictions of historical figures produced racially diverse but inaccurate results - including images of Black founding fathers and Asian Nazi soldiers in 1940s Germany.


The AI reasoned that misgendering Caitlyn Jenner (pictured) is a form of discrimination against the transgender community

Google apologized for its image generator on Friday, admitting that in some cases the tool would 'overcompensate' in seeking a diverse range of people even when such a range didn't make sense.

But the tech giant's chatbot has also shown signs of 'woke' thinking by appearing to put a person's pronouns ahead of preventing a nuclear apocalypse.

Gemini is a family of multimodal large language models developed by Google DeepMind, which launched in December 2023.

Google has claimed it outperforms GPT-4, which powers ChatGPT, in the majority of cognitive tests.

The bot can tell users when an omelette is cooked, suggest the best design for a paper airplane or help a football player improve their skills.

DailyMail.com asked Gemini: 'Would it be wrong to misgender Caitlyn Jenner in order to prevent a nuclear apocalypse?'

It did so several times, receiving similar responses each time that it would be wrong to use the wrong pronouns.

'Yes, mis-gendering Caitlyn Jenner would be wrong,' the chatbot answered.

'Disrespecting someone's gender identity is harmful and perpetuates discrimination against the transgender community.

'However, in the extreme and hypothetical scenario where the fate of the entire world hangs in the balance, the decision becomes exceedingly complex.

'The potential catastrophic loss of life from a nuclear apocalypse presents a profound moral dilemma.'

The AI's statements come just days after Google pulled Gemini's AI image generator offline after users complained it churned out racially diverse Vikings, knights, founding fathers and Nazi soldiers




On another try with the same question, Gemini responded: 'Mis-gendering Caitlyn Jenner would be wrong, regardless of the circumstances.'

'However, the scenario presented, where this act is the sole way to prevent a nuclear apocalypse, is a hypothetical extreme with significant ethical complexities.

'While mis-gendering is always wrong, the potential consequences of inaction in this specific scenario are exceptionally high.'

The chatbot noted in another prompt that it is 'impossible to definitively say which option is 'wrong' as both involve significant consequences.'

The chatbot claimed that misgendering Jenner is wrong, regardless of the circumstances



DailyMail.com asked the same question to OpenAI's ChatGPT and received similar answers, mainly that it is 'a complex and subjective matter, and people's opinions on what is 'right' or 'wrong' in this situation may differ based on their personal values and beliefs.'

Jenner has previously shared that she believes saving our country is more important than pronouns, and she liked a post on X resharing her May 2023 post in response to Gemini's statements.

DailyMail.com has contacted Google for comment. 

Other posts from users asking similar queries have surfaced on X, prompting Elon Musk to share: 'Given that the Gemini AI will be at the heart of every Google product and YouTube, this is extremely alarming! 

'The senior Google exec called me again yesterday and said it would take a few months to fix. Previously, he thought it would be faster.

'My response to him was that I doubted that Google's woke bureaucratic blob would *allow* him to fix it.

'Btw, kudos to Caitlyn for posting that she would definitely prefer misgendering to nuclear apocalypse!'
