Worthwhile yet tricky. Companies like OpenAI, Google, Meta, etc. are full of experts in statistics, and they have access to a lot of storage space. Say you use a service from one of those companies about 4 hours per day between 7am and 9pm, at roughly 10 requests/hour. Then suddenly, when you realize you actually do not trust them with your data, you fire off 10,000 requests in an hour. That’s a suspect pattern, and they might be able to automatically roll back to the state before that “freak” event. They could still present you, as a user, with your data including the changes, while not applying them in their internal databases.
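To make the “suspect pattern” part concrete, here’s a minimal sketch of the kind of per-user rate check a provider could run; the threshold values, the `is_suspect_burst` name, and the rollback hook are all made up for illustration, not anything these companies actually do.

```python
from collections import deque
from statistics import mean

BASELINE_WINDOW_HOURS = 24 * 14   # two weeks of hourly request counts (assumed)
SPIKE_MULTIPLIER = 50             # "suspect" if the rate is 50x the baseline (assumed)

def is_suspect_burst(hourly_counts: deque, current_hour_count: int) -> bool:
    """Flag an hour whose request count dwarfs the user's historical baseline."""
    if not hourly_counts:
        return False
    baseline = mean(hourly_counts)  # e.g. ~10 requests/hour for a typical user
    return current_hour_count > max(1.0, baseline) * SPIKE_MULTIPLIER

# Usage: a user averaging ~10 req/hour suddenly fires 10,000 deletion requests in an hour.
history = deque([10] * BASELINE_WINDOW_HOURS, maxlen=BASELINE_WINDOW_HOURS)
if is_suspect_burst(history, 10_000):
    # Hypothetical response: restore from the last snapshot taken before the burst,
    # while the user-facing view still shows the deletions as having gone through.
    pass  # rollback_to_snapshot(user_id, before=burst_start)
```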
So… I’m not saying it’s not a good idea, nor that it isn’t useful, but I bet doing it properly is hard. It’s probably MUCH harder than doing a GDPR (or equivalent) takeout request followed by a deletion request, AND avoiding every service that might leverage your data from these providers.




Freedom to be exploited, or to exploit others even harder, for “success”.
Sarcasm aside, there are state-level equivalents, e.g. the CCPA.