anhthyngo / c-bert

This project is forked from wh629/c-bert.
Attention-based models such as BERT have produced state-of-the-art performance on many NLP tasks. However, these models suffer from catastrophic forgetting (CF) when fine-tuned on a sequence of tasks in a continual learning setting. We hypothesize that further training BERT with meta-learning methods such as OML (Online-aware Meta-Learning) will yield contextual representations that are more robust to CF.
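As a toy illustration of catastrophic forgetting (a hypothetical sketch, not code from this repository): a single linear classifier is trained sequentially on two tasks whose labels conflict, and its accuracy on the first task collapses after training on the second.

```python
import numpy as np

def train(w, X, y, lr=0.5, steps=200):
    """Plain gradient descent on the logistic-regression loss."""
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)       # gradient of the log-loss
    return w

def accuracy(w, X, y):
    return float(np.mean(((X @ w) > 0) == (y > 0.5)))

# Two features: the input value and a constant bias term.
X = np.array([[1.0, 1.0], [2.0, 1.0], [-1.0, 1.0], [-2.0, 1.0]])
y_a = np.array([1.0, 1.0, 0.0, 0.0])   # task A: positive inputs -> class 1
y_b = 1.0 - y_a                        # task B: the same inputs, labels flipped

w = np.zeros(2)
w = train(w, X, y_a)
acc_a_before = accuracy(w, X, y_a)     # high right after training on task A
w = train(w, X, y_b)                   # continue training on the new task
acc_a_after = accuracy(w, X, y_a)      # task-A accuracy degrades
print(acc_a_before, acc_a_after)       # -> 1.0 0.0
```

The same phenomenon appears, in softer form, when a large model like BERT is fine-tuned task-by-task: later gradients overwrite parameters the earlier tasks relied on, which is the failure mode the meta-learning approach above aims to mitigate.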