Re: AI Hallucinating
From: Sharon Villines (sharon@sharonvillines.com)
Date: Wed, 25 Sep 2024 10:50:40 -0700 (PDT)
> For those who are unaware, ChatGPT regularly hallucinates, aka gives 
> plausible-sounding but fake answers. Its answers always need to be 
> fact-checked.

One of the things that is wonderful for writers is the ability to use the Web 
to immediately fact-check birthdates, event dates, name spellings, etc. I don’t 
have to go to the library or keep a current set of encyclopedias to quickly 
confirm a tiny and mostly irrelevant piece of information. For someone who 
likes to do research and thinks in terms of a relational database (with an 
imprecise memory), this is such a huge gift.

But I have no faith in AI, ChatGPT, etc. If you haven’t noticed, many searches 
will now start with a ChatGPT entry as the first result. I recently searched 
“famous FBI Director” because I couldn’t remember his name. The response I got 
was “J. Edgar Hoover” with a picture of the current FBI Director, Christopher 
Wray. It took a few seconds to figure out what I was seeing.

(I tried to duplicate this today and it wasn’t happening. Google might have 
been running some kind of test over the last two weeks that doesn’t seem to be 
in use now.)

And I also had a problem recently with my Chrome browser using Yahoo Search 
instead of Google, even though I hadn’t changed any settings. Reddit is 
becoming my go-to for weird things like this. The answer was that an extension 
I had downloaded from the Chrome Web Store had a bug/worm/virus that switched 
all my searches to Yahoo. It wasn’t an odd add-on from a geek developer. I 
don’t know which one it was because I just deleted the ones I hadn’t been 
using for years and years. Problem solved.

HAL is here.

Sharon
----
Sharon Villines, Washington DC

"Give someone a book, they'll read for a day. Teach someone to write a book and 
they will spend a lifetime mired in paralyzing self-doubt.”  — Lauren DeStafano
