#ai
#computing
In [[Computing/Intelligence/Machine Learning/Machine Learning]] we build "intelligence" from data plus an algorithm, and the trend is toward bigger models trained on more data.
It's then common to lightly "edit" these models with a small amount of extra data for a specific task.
For example, to edit [[GPT3]] for talking about cats (we say fine-tuning), you feed it plenty of text about cats.
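A toy sketch of the idea, with a hypothetical one-weight "model": pretrain on general data, then keep running gradient descent on a small task dataset. Real fine-tuning of something like GPT-3 is this in spirit, just with billions of weights.

```python
# Toy fine-tuning sketch: a one-parameter model y = w * x.
# "Pretraining" and "fine-tuning" are both just gradient descent,
# only the dataset changes. Everything here is illustrative.

def train(w, data, lr=0.01, steps=500):
    """Fit y = w * x by gradient descent on mean squared error."""
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

# "Pretraining" on general data (true relation: y = 2x)
general = [(x, 2.0 * x) for x in range(1, 6)]
w = train(0.0, general)

# "Fine-tuning" on a small task-specific dataset (true relation: y = 3x)
task = [(1.0, 3.0), (2.0, 6.0)]
w = train(w, task)
print(round(w, 2))  # → 3.0: the model moved from the general fit to the task fit
```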
What if instead we removed from the model knowledge unrelated to cats?
**Less is more**, [[Via Negativa|inverse model]].
Possibly the path to this: https://arxiv.org/pdf/2104.08696.pdf
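A minimal sketch of the via-negativa direction, using magnitude pruning as the simplest proxy for "removing unrelated knowledge". The feature names and weights are entirely hypothetical; real knowledge editing (as in the linked paper) operates on model parameters, not labeled topics.

```python
# Toy sketch of editing-by-removal: zero out (prune) the parts of a
# hypothetical model that barely contribute to the task we care about,
# instead of adding new task data.

weights = {
    "cats":    0.90,  # hypothetical "knowledge" weights per topic
    "dogs":    0.05,
    "finance": 0.02,
    "cooking": 0.01,
}

def prune(weights, threshold=0.1):
    """Keep only weights whose magnitude reaches the threshold; zero the rest."""
    return {k: (v if abs(v) >= threshold else 0.0) for k, v in weights.items()}

pruned = prune(weights)
print(pruned)  # only "cats" survives; everything unrelated is removed
```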