OpenAI CEO Sam Altman has announced a set of changes designed to significantly alter how ChatGPT communicates with users under the age of 18. He says teen safety is a priority: the company no longer sees itself merely as a technology provider, but as a digital guardian that can recognize threats and intervene when necessary.
No more flirting or crisis-related content
The rules cover two areas particularly vulnerable to abuse: intimate relationships and mental health crises. ChatGPT will not engage in flirtatious conversation with underage users, and new mechanisms block the creation of narratives or scenarios involving self-harm or threats to life. When the system detects serious warning signs, it can notify a parent or caregiver, and in extreme cases alert emergency services such as the police.