Comments (4)
Try pip install pytorch-lightning==0.9.0.
from prefixtuning.
@Jamesswang, according to the environment.yml on the master branch, you should be fine using pytorch-lightning==0.8.5. If you set up a clean environment from this file, e.g. conda env create -f environment.yml, you should avoid dependency issues. That said, I had to remove the following two lines when setting up the environment:
- bluert==0.0.1
- en-core-web-sm==2.3.1
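One way to apply that workaround before creating the environment is to delete the two pins with sed. The heredoc below is a minimal stand-in for the repo's real environment.yml (its actual contents are longer), shown only so the deletion step is concrete:

```shell
# Minimal stand-in for the repo's environment.yml (assumed structure, for demonstration):
cat > environment.yml <<'EOF'
dependencies:
  - pip:
    - pytorch-lightning==0.8.5
    - bluert==0.0.1
    - en-core-web-sm==2.3.1
EOF

# Drop the two pins that pip cannot resolve, editing the file in place:
sed -i '/bluert==0.0.1/d;/en-core-web-sm==2.3.1/d' environment.yml

# Then create the environment as usual:
# conda env create -f environment.yml
```

Alternatively, just delete the two lines by hand before running conda env create.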
@tannonk Thank you for your reply. When I ran the code, I hit the following error after one epoch:
Traceback (most recent call last):
  File "finetune.py", line 876, in <module>
    main(args)
  File "finetune.py", line 784, in main
    logger=logger,
  File "/home/wanghaotian/PrefixTuning/seq2seq/lightning_base.py", line 795, in generic_train
    trainer.fit(model)
  File "/home/wanghaotian/miniconda3/envs/prefix-tuning/lib/python3.6/site-packages/pytorch_lightning/trainer/trainer.py", line 1003, in fit
    results = self.single_gpu_train(model)
  File "/home/wanghaotian/miniconda3/envs/prefix-tuning/lib/python3.6/site-packages/pytorch_lightning/trainer/distrib_parts.py", line 186, in single_gpu_train
    results = self.run_pretrain_routine(model)
  File "/home/wanghaotian/miniconda3/envs/prefix-tuning/lib/python3.6/site-packages/pytorch_lightning/trainer/trainer.py", line 1213, in run_pretrain_routine
    self.train()
  File "/home/wanghaotian/miniconda3/envs/prefix-tuning/lib/python3.6/site-packages/pytorch_lightning/trainer/training_loop.py", line 370, in train
    self.run_training_epoch()
  File "/home/wanghaotian/miniconda3/envs/prefix-tuning/lib/python3.6/site-packages/pytorch_lightning/trainer/training_loop.py", line 470, in run_training_epoch
    self.run_evaluation(test_mode=False)
  File "/home/wanghaotian/miniconda3/envs/prefix-tuning/lib/python3.6/site-packages/pytorch_lightning/trainer/evaluation_loop.py", line 430, in run_evaluation
    self.on_validation_end()
  File "/home/wanghaotian/miniconda3/envs/prefix-tuning/lib/python3.6/site-packages/pytorch_lightning/trainer/callback_hook.py", line 112, in on_validation_end
    callback.on_validation_end(self, self.get_model())
  File "/home/wanghaotian/miniconda3/envs/prefix-tuning/lib/python3.6/site-packages/pytorch_lightning/utilities/distributed.py", line 12, in wrapped_fn
    return fn(*args, **kwargs)
  File "/home/wanghaotian/miniconda3/envs/prefix-tuning/lib/python3.6/site-packages/pytorch_lightning/callbacks/model_checkpoint.py", line 318, in on_validation_end
    self._save_model(filepath)
TypeError: _save_model() missing 2 required positional arguments: 'trainer' and 'pl_module'
Have you ever encountered this problem when running the code?
Yes, I actually ran into the same error and haven't managed to solve it yet. I'd open a new issue for that...
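For anyone hitting this, the traceback suggests a version-skew problem: pytorch-lightning 0.8.x invokes the checkpoint hook as _save_model(filepath), while a callback written against a newer API defines _save_model(self, filepath, trainer, pl_module), so the call is short two arguments. A dependency-free sketch of the failure mode and one possible shim (class and method names here are illustrative, not the repo's actual code):

```python
class OldFramework:
    """Mimics pytorch-lightning 0.8.x, which calls _save_model with only a filepath."""
    def on_validation_end(self):
        return self._save_model("checkpoint.ckpt")  # old single-argument call site

class NewStyleCallback(OldFramework):
    """Mimics a callback written against the newer signature
    _save_model(self, filepath, trainer, pl_module)."""
    def _save_model(self, filepath, trainer, pl_module):
        return f"saved {filepath}"

class ShimmedCallback(NewStyleCallback):
    """Compatibility shim: tolerate the old single-argument call
    by defaulting the two arguments the old framework never passes."""
    def _save_model(self, filepath, trainer=None, pl_module=None):
        return super()._save_model(filepath, trainer, pl_module)

try:
    NewStyleCallback().on_validation_end()
except TypeError as e:
    print(e)  # reproduces the missing-arguments TypeError from the traceback above

print(ShimmedCallback().on_validation_end())  # saved checkpoint.ckpt
```

The cleaner fix is simply to align versions: install the pytorch-lightning release the repo's callbacks were written for, rather than shimming.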
Related Issues (20)
- Understanding the Seq2Seq Encoder-Decoder Prefix Implementation
- Use --init_shallow_word for seq2seq model
- Hi, What do function names get_prompt_p3,5,7 ... mean?
- AttributeError: 'tuple' object has no attribute 'detach'
- Should've mentioned about "CRITICAL" modifications done in transformers source code
- notation typo in the paper
- question about the initialization experiment
- Is it necessary to arrange position ids between [prefix_len, prefix_len+seq_len) ?
- control code is not used in PrefixTuning.get_prompt()
- Data preparation step
- Which version of the Transformers did you modify? v3.2.0?
- How to do inference with prefix tuning?
- Question: Do we need a supervised dataset with X & Y pairs?
- This is a relatively bad project because some strange packages were used
- License
- Data issue and file location query
- which paper is the first one to propose soft prompt
- Need to update many things in the repository.
- Has this code been integrated into any popular code repositories?
- The paper mentions the ability to train multiple prefixes in a single batch.