12 Nov 2025, Wed


OpenAI recently released a report stating that millions of users are talking to ChatGPT about mental health problems. According to the report, roughly 0.15% of users share dangerous thoughts, such as suicidal ideation, in chat every week. Given ChatGPT's roughly 800 million (80 crore) weekly active users, this is a considerable number. Many users also feel an emotional connection with the chatbot, which the report says may itself be a sign of mental distress.

OpenAI said that ChatGPT has been trained with input from more than 170 experts to give accurate and sensitive answers to mental health questions. According to the company, the new model responds appropriately in 91% of such cases, up from 77% for the previous model. OpenAI also says ChatGPT now maintains these safeguards even during long conversations.

Lawsuit filed after the death of a 16-year-old boy

A lawsuit was recently filed against OpenAI after the suicide of a 16-year-old boy who had shared his thoughts with ChatGPT before his death. Following this, the US states of California and Delaware warned the company that it must ensure the safety of its users. In response, OpenAI has added features such as parental controls and age detection for minors.

What research has found

Research has also found that chatbots can sometimes negatively affect mentally vulnerable users. OpenAI CEO Sam Altman, however, claims the company is developing ChatGPT to be genuinely helpful for mental health, while experts point out that these safety features are currently available only to paid users.
