ChatGPT is like a kid! It makes typing mistakes, and when told, it says sorry!


Highlights

ChatGPT was launched in November last year
GPT-4 is the advanced version of ChatGPT
Chatbot made a mistake while answering a question

New Delhi. ChatGPT has become quite popular since it was launched by OpenAI in November last year. People were quite impressed by its ability to write long pieces of code. Hopes rose even further when its advanced version, GPT-4, was introduced in March. But you may be surprised to know that GPT-4 also makes mistakes like humans do, and it even says sorry.

A Reddit user has shared a screenshot of a conversation with GPT-4. It turns out that the chatbot made a typographical error while answering. Responding to a query titled ‘Pet Shop Recording Concern’, the chatbot misspelled the word infringing, writing “infrishing any local laws” in its reply.

The chatbot made a mistake while answering a question. (Post: hairball201)

But when the user asked the chatbot what “infrishing” meant, it corrected the mistake. The chatbot apologized and said that “infrishing” was a typographical error; the correct word is infringing, which means breaking a rule.


ChatGPT’s App
Let us tell you that ChatGPT’s iOS app was recently made available in 32 countries beyond the US, and India is also included in that list. At present, an Android version of the app has not been introduced. Until now, the chatbot was available only through the website, so the arrival of the app will make it much more convenient to use.


