ChatGPT Downside: Lawyer Faces Trouble After Using AI For Research That Cited Fake Cases That Never Existed


ChatGPT: Are you using ChatGPT or any other AI chatbot? If so, you need to be careful. If you are gathering facts with the help of AI, you should put in the effort to verify them yourself. Yes, AI chatbots like Bard or ChatGPT give mostly accurate answers, but there is no guarantee that the answers are always correct. That is why a human mind is still needed to verify the facts. Recently, a case came to light in which ChatGPT presented false facts.

The lawyer used ChatGPT

According to a BBC report, ChatGPT gave wrong information to a lawyer in New York. The lawyer used ChatGPT for legal research and, after relying on it, is now facing trouble in court. The court found that the legal cases cited by the lawyer and his firm in an ongoing matter never existed. The judge described the incident as an "unprecedented circumstance".

The lawyer was embarrassed in court

The research was handed to a lawyer named LoDuca by his colleague Schwartz, who has been practicing law for more than 30 years. Schwartz used ChatGPT to find earlier cases related to the matter. When questioned about this, he said the AI tool had given him wrong information and that he had not been aware AI could do so. Furthermore, in a written statement, Schwartz clarified that LoDuca was not aware of how the research was conducted and was not a part of it in any way.

Be careful while using AI

Tech experts have cautioned that chatbots can sometimes make things up and show you wrong information. Relying on AI, many users end up accepting false claims as truth, which is why we need to fact-check for ourselves. In this way, ChatGPT can be used to spread misinformation, intentionally or unintentionally.



