Friday, 29 August 2025

Anthropic Will Train Claude on Your Chats by Default
Anthropic has updated its policy: as of September 28, Claude users on Free, Pro, and Max plans must actively opt out if they do not want their new or resumed chats and coding sessions used to train AI models. Opting in extends data retention to five years, while opting out keeps the standard 30-day deletion period. The interface features a prominent Accept button with a subtler toggle for training permissions, raising privacy and consent concerns.
Read full story at TechCrunch
