Re: AI Hallucinating
From: Lisa Kuntz (lisa.kuntz ...)
Date: Sat, 28 Sep 2024 06:36:58 -0700 (PDT) |
Thanks, Sharon, very much appreciated!

Lisa Kuntz
Daybreak Cohousing
Portland OR

On Wed, Sep 25, 2024 at 10:50 AM Sharon Villines via Cohousing-L <cohousing-l [at] cohousing.org> wrote:

> For those who are unaware, ChatGPT regularly hallucinates, aka gives plausible-sounding but fake answers. Its answers always need to be fact-checked.
>
> One of the things that is wonderful for writers is the ability to use the Web to immediately fact-check birthdates, event dates, name spellings, etc. I don't have to go to the library or keep a current set of encyclopedias to quickly confirm a tiny and mostly irrelevant piece of information. For someone who likes to do research and thinks in terms of a relational database (with an imprecise memory), this is such a huge gift.
>
> But I have no faith in AI, ChatGPT, etc. If you haven't noticed, many searches will now start with a ChatGPT entry as the first result. I recently searched "famous FBI Director" because I couldn't remember his name. The response I got was "J. Edgar Hoover" with a picture of the current FBI Director, Christopher Wray. It took a few seconds to figure out what I was seeing.
>
> (I tried to duplicate this today and it wasn't happening. Google might have had some kind of test going over the last two weeks, and that doesn't seem to be in use now.)
>
> And I also had a problem recently with my Chrome browser using Yahoo Search instead of Google, even though I hadn't changed any settings. Reddit is becoming my go-to for weird things. The answer was that an extension I had downloaded from the Chrome Extension site had a bug/worm/virus that switched all my searches to Yahoo. It wasn't an odd add-on from a geek developer. I don't know which one it was because I just deleted the ones I hadn't been using for years and years. Problem solved.
>
> Hal is here.
>
> Sharon
> ----
> Sharon Villines, Washington DC
>
> "Give someone a book, they'll read for a day. Teach someone to write a book and they will spend a lifetime mired in paralyzing self-doubt." — Lauren DeStefano
>
> _________________________________________________________________
> Cohousing-L mailing list -- Unsubscribe, archives and other info at:
> http://L.cohousing.org/info