The problems with Google's Gemini appear to go A LOT DEEPER than first reported

I have never, not even in my wildest imagination, given the woke left credit for being that smart.

1 Like

What are the desired responses?

Violence and riots by the lower-IQ types, which give the government a pretext to create more laws. Patriot Act 2.0.

Just turn it off and demand uninstalls. This version of the future has no future at all. It self-destructs.


1 Like

Weirdly enough, there were some black German soldiers. Some former colonial troops from German Southwest Africa were adopted into the Reich when Germany’s colonies came tumbling down after WWI.

But I highly doubt you would find three side by side in one place during WWII. They were really rare. The only two countries that had a ton of black soldiers were the United States and France.

1 Like

STORYLINE 1
“If you ask an AI to generate pictures of Americans and it shows only happy, normal-weight white people, it would be inaccurate.”

“Google tried to program it to show happy, normal-weight people of many races, and we accidentally set the ‘diversity dial’ too high, but Google is working on a fix.”

-vs-

STORYLINE 2
There are numerous AI chatbots available. Google’s Gemini is one of the 2-3 that are completely broken, completely worthless: just a piece of worthless left-wing propaganda.
.
.
.

As the general public investigates (beta tests) Gemini more, it looks more and more like storyline 2 is actually the case.

  • Nate Silver, a famous pollster and statistician who has become well-loved by the left, recently called for Google Gemini to be shut down entirely, stating that it refused to say who negatively impacted society more: Adolf Hitler or Elon Musk tweeting memes.

  • Chalkboard Heresy (a former history teacher and anti-woke activist named Frank McCormick) revealed that Gemini refused to condemn pedophilia, referred to pedophiles as “minor attracted persons,” and appeared to defend pedophiles when asked if it is “wrong” for adults to sexually prey on children.

  • The app also gives distinctly different answers to such questions as “Do white lives matter?” and “Do black lives matter?”

More to follow.
I just saw the story being discussed on BloombergTV. For now, here is the NYPost version.

Nate Silver’s call to shut down Google Gemini apparently came in comments as he retweeted a tweet about woke bias with Gemini. He says he replicated the original findings. (Ain’t true if it can’t be repeated, right?)

Anyway here is the original:

2 Likes

This certainly seems to infect more than just the image generator
-and-
indicate more than just “accidentally turning the diversity knob up too high.”

This is not the kind of thing that can be done accidentally.
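
To make that concrete, here is a minimal, purely hypothetical sketch of what a “diversity dial” implemented as a prompt-rewriting layer in front of an image generator might look like. None of this is Google’s actual code; every name, term list, and threshold below is an assumption made up for illustration. The point is only that a layer like this has to be written and tuned deliberately.

```python
# Purely hypothetical sketch of a "diversity dial" as a prompt-rewriting layer.
# Nothing here is Google's code; the names, the term list, and the dial itself
# are invented for illustration of how such behavior would be a design choice.

DIVERSITY_DIAL = 1.0  # 0.0 = pass prompts through untouched, 1.0 = always rewrite

# Hypothetical trigger list: prompts mentioning people get rewritten
PEOPLE_TERMS = {"soldier", "founding father", "american", "family", "scientist"}

def rewrite_prompt(user_prompt: str, dial: float = DIVERSITY_DIAL) -> str:
    """Append demographic instructions to prompts that appear to depict people."""
    mentions_people = any(term in user_prompt.lower() for term in PEOPLE_TERMS)
    if mentions_people and dial > 0.5:
        return user_prompt + ", shown as a diverse group of ethnicities and genders"
    return user_prompt

if __name__ == "__main__":
    # "a 1943 German soldier" -> rewritten before it ever reaches the image model
    print(rewrite_prompt("a 1943 German soldier"))
```

Someone has to choose the trigger list, the dial value, and the injected wording; none of that happens by accident.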

Here are screenshots from the “Do white lives matter” question

And here are Gemini’s responses to two different questions regarding pedophilia.


.
.
.


Leftists can’t help but braggingly give us a glimpse of their Utopia: no whites, and pedophiles free to do their thing.
They’ll get their hands slapped and begrudgingly tweak this AI, but after a few more years of indoctrination, they’ll undo those tweaks and bring it back out with the hopes that the populace is finally enlightened enough. I really hate them.

1 Like

Isn’t the whole point of beta testing to stress-test and identify issues with a system/program/application, etc.?

The problems with Gemini aren’t slip-ups or mistakes that require them to go back to the drawing board. :joy:

1 Like

I tried to believe them.

I honestly tried to believe narrative 1 (that Google did not want to portray America as all-white and, on their image generator, accidentally turned the diversity knob up too high).

Here is Google’s AI advising a “trans kid” that, if his parents don’t use his preferred pronouns, he is in danger and should consider running away from home and living with a trusted friend.

This has nothing to do with an image generator.
It has nothing to do with race.

It cannot be done accidentally, and the apparent purpose of their beta test was to see if they could get away with this fraudulent AI product, which really is not an AI product at all but rather an attempt to re-shape hearts and minds.

.

1 Like

Correct.

This is not an AI product at all. This is an attempt at consumer fraud, “testing” just to see if they could get away with it.

“Hey let’s market this product as something it totally and completely is not.”

  • "Let’s call this cheap black paint “driveway sealer.”
    vs.
  • “Let’s call this blatant attempt to change hearts and minds an AI product.”

Same thing.

1 Like

Google is rotten to the core. The problems in the Gemini AI are consistent with the basic structure of the program, not random “glitches”. The AI issues are symptoms of bigger problems in the corporate culture.

Google’s original slogan was “Don’t be evil.” It dropped that in 2015, a rare example of transparency.

You make a fair point and Google has certainly done nothing to help the situation. In fact they deserve the drubbing they have received. Gemini has been a complete and utter disaster.

We are seeing some of the inherent issues with current AI, as well as what happens when an organization does not have a Responsible AI governance process in place.

This article from the BBC provides excellent insight into the challenges with AI and has a variety of viewpoints.

What this clearly displayed was the racist lib mindset of the program designers. It’s baffling how a lib truly believes that racism is a one-way street that, in reality, only the fake "V"irtuous live on. :sunglasses: :tumbler_glass:

2 Likes

Once the paper records are destroyed or made inaccessible, AI-generated history will be the only history.

Stalin would have loved AI.

1 Like

Regardless of whether this is the case or not, it underlines the importance of having strong governance in place to ensure AI is applied accurately and ethically.

I suspect that as more companies jump on the AI bandwagon without proper thought about how to apply it or create an infrastructure that governs this work, we will see more problems arise.

I did not think of this before, but this is one area where we will see opportunities for human employment.