Blog

First day of Autobotic(R)

Who am I?

Hallucination of ChatGPT-4

The following is an example of ChatGPT hallucinating when the details of a prompt are not specific enough. As shown in the table below, I got opposite answers to two closely related questions. So my takeaway is: whatever conclusion ChatGPT reaches, you may need to validate it yourself! Questions to ChatGPT | Response…
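One simple way to act on that takeaway is a self-consistency check: ask the model the same question phrased two different ways and flag any disagreement for manual validation. The sketch below illustrates the idea; `ask` is a hypothetical stand-in for a real chat-completion API call, mocked here with canned answers.

```python
def ask(question, canned):
    """Placeholder for a real LLM API call; returns a canned answer here."""
    return canned[question]

def consistent(q1, q2, canned):
    """Ask two closely related questions and check whether the
    normalized answers agree. A mismatch suggests a possible
    hallucination that deserves manual validation."""
    a1 = ask(q1, canned).strip().lower()
    a2 = ask(q2, canned).strip().lower()
    return a1 == a2

# Mocked responses mirroring the situation in the table: two close
# variants of the same question getting opposite answers.
canned = {
    "Is X true?": "Yes",
    "Is X generally true?": "No",
}

print(consistent("Is X true?", "Is X generally true?", canned))
```

A `False` result does not prove the model is wrong, only that its answers are unstable on this question, which is exactly the signal to verify the conclusion yourself.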

