OpenAI Restricts ChatGPT: Ban on Medical, Legal, and Financial Advice
According to the updated policy, ChatGPT will no longer be able to name specific drugs, indicate dosages, or propose treatment plans. In the legal domain, the model may not draft legal templates or claims, or advise on courtroom strategy. Similar restrictions apply to finance — ChatGPT will not recommend investments, assess the profitability of assets, or suggest what to buy or sell.
OpenAI representatives stated that the changes are intended to improve user safety and prevent errors that could carry legal, financial, or medical risks. ChatGPT is now limited to the role of an informational assistant: it can explain the general principles of systems, laws, and tools, but cannot provide personalized guidance.
Before the restrictions were introduced, users routinely relied on ChatGPT to analyze legislation, draft legal documents, and evaluate investment strategies. After the update, these functions will be available only in a descriptive format — without concrete recommendations or forecasts.
The new policy applies to all versions of ChatGPT, including the paid GPT-5 tier and enterprise offerings. OpenAI emphasized that the company's top priority remains "the responsible use of artificial intelligence" and minimizing the risk of misuse in critical domains.