I’m rather curious to see how the EU’s privacy laws are going to handle this.
(Original article is from Fortune, but Yahoo Finance doesn’t have a paywall)
“AI model unlearning” is the equivalent of saying “removing a specific feature from a compiled binary executable”. So, yeah, basically not feasible.
But the solution is painfully easy: you remove the data from your training set (i.e., the source code) and re-train your model (recompile the executable).
Yes, it may cost you a lot of time and money to accomplish this, but such are the consequences of breaking the law. Maybe be extra careful about obeying laws going forward, eh?
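To make the analogy concrete, here's a minimal sketch of "unlearning by retraining": filter the flagged records out of the training set, then fit a fresh model on what remains. The dataset, `is_flagged` predicate, and `toy_train` function are all stand-ins for illustration, not any real pipeline.

```python
# Hypothetical sketch: drop flagged data, then retrain from scratch.

def remove_and_retrain(dataset, is_flagged, train_fn):
    """Remove every record matching is_flagged, then retrain on the rest."""
    cleaned = [record for record in dataset if not is_flagged(record)]
    return train_fn(cleaned)

# Toy "training": just count records per label.
def toy_train(records):
    counts = {}
    for label, _ in records:
        counts[label] = counts.get(label, 0) + 1
    return counts

data = [("ok", "a"), ("private", "b"), ("ok", "c")]
model = remove_and_retrain(data, lambda r: r[0] == "private", toy_train)
print(model)  # {'ok': 2}
```

The point is that the "unlearning" happens entirely upstream of the model: nothing is surgically removed from the trained weights, because the personal data simply never enters the new training run.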
Just kill it off and start from the beginning.