News of the World

OpenAI Restricts ChatGPT: Ban on Medical, Legal, and Financial Advice

OpenAI has officially introduced new restrictions for ChatGPT. Starting October 29, the service no longer gives specific recommendations related to medicine, law, or finance. The model can now only explain general principles, describe mechanisms, and direct users to qualified professionals.
Nov 3, 2025 - 14:08
According to the updated policy, ChatGPT will no longer mention specific drug names, indicate dosages, or propose treatment plans. In the legal field, the model is barred from drafting legal templates or claims and from offering advice on court strategy. Similar restrictions apply to finance — ChatGPT will not recommend investments, assess the profitability of assets, or suggest what to buy or sell.

OpenAI representatives stated that the changes are aimed at improving user safety and preventing potential errors that could lead to legal, financial, or medical risks. ChatGPT is now limited to the role of an informational assistant, capable of explaining general principles of systems, laws, and tools without providing personalized guidance.

Before the restrictions were introduced, users actively relied on ChatGPT to analyze legislation, draft legal documents, and evaluate investment strategies. After the update, these functions are available only in a descriptive format — without concrete recommendations or forecasts.

The new policy applies to all ChatGPT versions, including the paid GPT-5 tiers and enterprise offerings. OpenAI emphasized that the company’s top priority remains “the responsible use of artificial intelligence” and minimizing the risk of misuse in critical domains.