Comments (4)
Hey, can you try setting `"language": None`? `language` is only used when stacking a task adapter (such as NER) on a pre-trained language adapter (such as `en`). You would need to additionally load a language adapter such as `en/wiki@ukp` to use this flag with e.g. `bert-base-multilingual-cased`.
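For reference, stacking a task adapter on a language adapter could look roughly like this (a hedged sketch following the AdapterHub docs; the adapter identifiers, the `"ner"` task adapter name, and the label count are illustrative placeholders, and running it requires network access to the Hub):

```python
# Hedged sketch: stacking a task adapter on a pre-trained language adapter
# with the adapters library. Names below are illustrative, not prescriptive.
from adapters import AutoAdapterModel
from adapters.composition import Stack

model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")

# Load the pre-trained English language adapter from AdapterHub.
lang = model.load_adapter("en/wiki@ukp")

# Add a fresh task adapter (to be trained) plus a tagging head for NER.
model.add_adapter("ner")
model.add_tagging_head("ner", num_labels=9)

# Activate both as a stack: inputs pass through the language adapter
# first, then the task adapter.
model.set_active_adapters(Stack(lang, "ner"))
```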
Thanks for pointing this out, this is a bug and we will fix it!
Hi @JoPfeiff ,
thanks for that hint - it's working now! I made some experiments with the WNUT-17 dataset (not on CoNLL-2003 as originally stated in the issue) to compare against a fully fine-tuned model:
Model | Run 1 | Run 2 | Run 3 | Run 4 | Run 5 | Avg. |
---|---|---|---|---|---|---|
Fully fine-tuned | (60.60) / 48.65 | (58.58) / 47.79 | (58.24) / 48.48 | (61.10) / 48.06 | (58.47) / 48.34 | (59.40) / 48.26 |
Adapter (Pfeiffer) | (57.81) / 46.36 | (58.31) / 45.89 | (58.31) / 46.78 | (55.78) / 46.51 | (57.75) / 46.85 | (57.59) / 46.48 |
There's currently a performance gap of 1.78 points on average, so do you have any hints for boosting performance 🤔 Thanks ❤️
We have found that, due to the newly introduced parameters, adapters require a higher learning rate. A default `lr=0.0001` seems to work well in most cases. Also, depending on the dataset size, you might need to train longer than usual. We usually trained for up to 30 epochs and stored the best model based on the results on the dev set.
You can also try out the `houlsby` architecture or increase the bottleneck size (currently set via `reduction_factor`).
We will soon add a small guide on how to train adapters, where these kinds of tips will be included :)
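To make the `reduction_factor` knob concrete, here is a minimal sketch, assuming the bottleneck dimension is `hidden_size / reduction_factor` as in the Pfeiffer and Houlsby bottleneck-adapter designs (the hidden size of 768 is BERT-base's; the default `reduction_factor` of 16 is the Pfeiffer config's):

```python
# Minimal sketch: how reduction_factor controls adapter capacity,
# assuming bottleneck dimension = hidden_size / reduction_factor.

def bottleneck_size(hidden_size: int, reduction_factor: int) -> int:
    """Down-projection dimension of a bottleneck adapter."""
    return hidden_size // reduction_factor

# BERT-base hidden size is 768; Pfeiffer config defaults to reduction_factor=16.
print(bottleneck_size(768, 16))  # 48
# Lowering reduction_factor widens the bottleneck, adding capacity:
print(bottleneck_size(768, 2))   # 384
```

A smaller `reduction_factor` means more trainable parameters per adapter, which can help close the gap to full fine-tuning at some cost in parameter efficiency.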