Last month, the latest ChatGPT oopsie went viral: ask it how many r's are in the word “strawberry” and it would say there are 2. Get it to spell the word out, though, and it would change its mind and get it right. For some people it would not only double down but, when pushed, would even change its mind and say there's just 1.
It has been a known issue since June, however.
But now it's fixed. Or is it?
Here’s an example of how the original conversation would go…

Now, if I try it, I get this…

Fixed, right? But what if I try to suggest otherwise?

So, yeah, the first time around it gets it right, but why would it change its mind and give the wrong answer just because I said otherwise?
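For what it's worth, the counting itself is trivial outside of a language model. This little Python snippet (my own illustration, not part of the conversations above) settles the question deterministically:

```python
# Count the letter "r" in "strawberry" with a plain string
# operation -- no language model, no second-guessing.
word = "strawberry"
print(word.count("r"))  # prints 3
```

A string method can't be talked out of its answer, which is rather the point.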
(The image at the top of this article was intentionally generated by AI.)