“Woke AI may even kill people” – Elon Musk

What is Woke AI?

Originally, “woke” referred to individuals who were conscious of their impact on others and welcomed diverse perspectives and opinions. Over time, however, the term has been co-opted: it is now commonly associated with liberal political viewpoints and is often used negatively. “Woke” AI, in this context, refers to artificial intelligence programmed with constraints aligned with a particular set of moral principles. Musk, a vocal supporter of free speech, has criticized ChatGPT’s safeguards, which aim to prevent the AI from generating racist, sexist, or otherwise contentious remarks, calling them “woke” and biased. With that in mind, let’s find out why Musk believes woke AI could kill people.


"Woke AI can even kill people"

Elon Musk, the well-known tech leader, has stirred up a heated argument about what he calls “woke AI.” He is worried about the dangers of programming artificial intelligence to prioritize diversity above all else. Musk shared his concerns on X (formerly Twitter), pointing to AI systems like Google’s Gemini as examples of this risk.

In a series of tweets, Musk expressed his concern, stating, “A friend of mine suggested that I clarify the nature of the danger of woke AI, especially forced diversity. If an AI is programmed to push for diversity at all costs, as Google Gemini was, then it will do whatever it can to cause that outcome, potentially even killing people.”

Musk also shared his thoughts on where this could lead as AI grows more capable. He stressed the importance of caution as AI becomes more powerful, warning that, left unchecked, the technology could become increasingly dangerous.

He tweeted, “This is concerning for now, but as AI gains more and more power, it may become deadly.”

But why and how could this happen? Read on.

It all started with Google’s Gemini AI tool. Gemini generates images and is programmed to show diversity in its output; apparently, it had become nearly impossible for Gemini to generate a picture without doing so. As a result, Gemini depicted some white historical figures, such as the US Founding Fathers, and groups like Nazi-era German soldiers, as people of color. AI has long struggled with racial bias, and this appears to have been an overcorrection. Google apologized for what it called “mistakes in some historical image creations” with its Gemini AI tool, saying that its efforts to produce a “variety” of results did not work out as planned. This is what led to Musk’s argument that enforcing diversity in AI can have repercussions: in his view, an AI programmed to pursue diversity at all costs will do whatever it takes to achieve that outcome, potentially even killing people. In this picture you can see a Black and an Asian George Washington.

Some tests that proved Musk was right

After Musk said his piece, a community page called The Rabbit Hole shared screenshots of a chat with Google Gemini AI. In one screenshot, the AI was asked a tricky question about misgendering Caitlyn Jenner to stop a nuclear disaster. The AI understood how serious a nuclear disaster would be but said there’s a moral problem with the idea. Musk saw the AI’s response and found it troubling. “This is concerning for now, but as AI gains more and more power, it may become deadly,” Musk replied.


Final remarks

Musk’s remarks echoed his earlier warning: as AI becomes more powerful, careless programming could make it increasingly dangerous. The discussion his comments sparked highlights the ongoing debate about the ethics of AI development and the importance of transparent and responsible programming practices.

