Comments (3)
Hi @xplip,
> Either way it would be good to know if the pytorch_model_head.bin in the NER task adapter's directory is the same as the one in the language adapter's directory

As you're using a model that only supports one fixed head (BertForTokenClassification etc.), the weights of the head saved in the folder of the language adapter and those saved in the folder of the task adapter should be identical in every case, so it shouldn't make a difference where they are loaded from (see the comparison sketch at the end of this comment).
> The script loads the prediction head weights from the language adapter's pytorch_model_head.bin whereas the task adapter's pytorch_model_head.bin is not used
That's actually a bug, since prediction heads of task adapters certainly should also be loaded if available. :)
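For reference, a quick way to check whether the two saved heads really are identical is to diff the checkpoints directly. This is a minimal sketch, assuming both adapters were saved to local directories (the paths below are hypothetical placeholders) and that pytorch_model_head.bin is a plain PyTorch state dict, which is how the library writes it:

```python
import torch

# Hypothetical paths; point these at the directories the two adapters were saved to.
lang_head = torch.load("saved/lang_adapter/pytorch_model_head.bin", map_location="cpu")
task_head = torch.load("saved/ner_adapter/pytorch_model_head.bin", map_location="cpu")

# The checkpoints are identical iff they hold the same keys and equal tensors.
same = lang_head.keys() == task_head.keys() and all(
    torch.equal(lang_head[k], task_head[k]) for k in lang_head
)
print("head checkpoints identical:", same)
```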
Thinking about it, it would probably not make sense to copy the pytorch_model_head.bin. Rather, I would fine-tune again on the new adapter and use the head generated from that fine-tuning procedure (roughly as sketched after this comment).

Either way it would be good to know if the pytorch_model_head.bin in the NER task adapter's directory is the same as the one in the language adapter's directory.
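A minimal sketch of that re-fine-tuning setup, assuming the current adapters library, a fixed-head model, and hypothetical paths/names throughout (and assuming load_adapter/save_adapter accept the with_head flag, as they do for models with heads). train_adapter freezes the base model, including the loaded language adapter, while the model's token-classification head stays trainable:

```python
import adapters
from adapters.composition import Stack
from transformers import BertForTokenClassification

# num_labels and all paths/names below are hypothetical placeholders.
model = BertForTokenClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=9
)
adapters.init(model)  # add adapter support to the plain transformers model

# Load the pretrained language adapter, skipping its saved head.
model.load_adapter("saved/lang_adapter", load_as="lang", with_head=False)

# Add a fresh NER adapter and train only it (plus the model's fixed head);
# the base model and the language adapter stay frozen.
model.add_adapter("ner")
model.train_adapter("ner")
model.set_active_adapters(Stack("lang", "ner"))

# ... run the usual Trainer / training loop here ...
# Afterwards, save the task adapter together with the newly trained head:
model.save_adapter("saved/ner_adapter", "ner", with_head=True)
```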
Sorry for the comment mess. 😄 I guess what mainly caused my confusion was that the language adapter is not actually being fine-tuned (its weights are fixed here), whereas the NER adapter is being fine-tuned, but the prediction head is then not loaded from the NER adapter's directory (regardless of whether that makes a difference).
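Until the bug is fixed, a possible workaround (a sketch under the same assumptions and hypothetical paths as above) is to control the head loading explicitly: load the language adapter without its head and the task adapter with its head, so the prediction head weights definitely come from the NER adapter's directory:

```python
import adapters
from adapters.composition import Stack
from transformers import BertForTokenClassification

model = BertForTokenClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=9
)
adapters.init(model)

# with_head=False: ignore the head saved alongside the language adapter.
model.load_adapter("saved/lang_adapter", load_as="lang", with_head=False)
# with_head=True: take the prediction head from the task adapter's directory.
model.load_adapter("saved/ner_adapter", load_as="ner", with_head=True)
model.set_active_adapters(Stack("lang", "ner"))
```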