
Comments (9)

danieldk commented on May 22, 2024

Could you give the backtrace of where this error occurs, or instructions on how to reproduce it?

Some background: prior to spaCy 3.7, our pretrained pipelines used spacy-transformers; starting from 3.7.0 they use spacy-curated-transformers. However, both types of pipelines work with spaCy 3.7.0. This error seems to come from code that tries to use the last_hidden_layer_state property, which is only available in spacy-curated-transformers, on transformer output that comes from spacy-transformers.
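
As a minimal sketch of the mismatch (assuming a spacy-transformers based pipeline is loaded; the example text and variable names are made up for illustration):

import spacy

# With a spacy-transformers pipeline, the transformer output stored on the Doc
# is a TransformerData object. It has no last_hidden_layer_state; that property
# only exists on spacy-curated-transformers output.
nlp = spacy.load("en_core_web_trf")  # assumes a spacy-transformers based (pre-3.7) pipeline
doc = nlp("A short test sentence.")
trf_data = doc._.trf_data
print(type(trf_data).__name__)                       # TransformerData
print(hasattr(trf_data, "last_hidden_layer_state"))  # False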

However, it is hard to say what code the error comes from without a backtrace of the exception.


srikanth-t20 commented on May 22, 2024

Sure. Here it is:

Error type: <class 'AttributeError'>;
Error value: 'TransformerData' object has no attribute 'last_hidden_layer_state';
Error Trace:
<FrameSummary file ../../TrainSpacyModel.py, line 56 in trainSpacyModel>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/cli/train.py, line 84 in train>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/training/loop.py, line 135 in train>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/training/loop.py, line 118 in train>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/training/loop.py, line 243 in train_while_improving>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/training/loop.py, line 298 in evaluate>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/language.py, line 1459 in evaluate>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/language.py, line 1618 in pipe>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/util.py, line 1685 in _pipe>
<FrameSummary file spacy/pipeline/pipe.pyx, line 55 in pipe>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/util.py, line 1685 in _pipe>
<FrameSummary file spacy/pipeline/transition_parser.pyx, line 245 in pipe>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/util.py, line 1632 in minibatch>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/util.py, line 1685 in _pipe>
<FrameSummary file spacy/pipeline/transition_parser.pyx, line 245 in pipe>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/util.py, line 1632 in minibatch>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/util.py, line 1685 in _pipe>
<FrameSummary file spacy/pipeline/trainable_pipe.pyx, line 79 in pipe>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy/util.py, line 1704 in raise_error>
<FrameSummary file spacy/pipeline/trainable_pipe.pyx, line 75 in spacy.pipeline.trainable_pipe.TrainablePipe.pipe>
<FrameSummary file spacy/pipeline/tagger.pyx, line 138 in spacy.pipeline.tagger.Tagger.predict>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/thinc/model.py, line 334 in predict>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/thinc/layers/chain.py, line 54 in forward>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/thinc/model.py, line 310 in call>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy_curated_transformers/models/listeners.py, line 415 in last_transformer_layer_listener_forward>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/thinc/model.py, line 334 in predict>
<FrameSummary file /root/anaconda3/envs/spacy_bert_pt/lib/python3.7/site-packages/spacy_curated_transformers/models/pooling.py, line 103 in with_ragged_last_layer_forward>


danieldk commented on May 22, 2024

Thank you! Could you give more information about the model configuration and the TrainSpacyModel.py program that is in the backtrace?

What seems to be happening here is that the model is using a transformer from spacy-transformers, so a transformer pipe that is configured like this:

[components.transformer]
factory = "transformer"
# ...

But the parser in the pipeline is configured with a listener from spacy-curated-transformers, e.g.:

[components.parser.model.tok2vec]
@architectures = "spacy-curated-transformers.LastTransformerLayerListener.v1"
width = 768
pooling = {"@layers":"reduce_mean.v1"}
upstream = "*"

However, that doesn't work. If the transformer is from spacy-transformers, then the listener should be from the same package. E.g.:

[components.parser.model.tok2vec]
@architectures = "spacy-transformers.TransformerListener.v1"
grad_factor = 1.0

[components.parser.model.tok2vec.pooling]
@layers = "reduce_mean.v1"


srikanth-t20 commented on May 22, 2024

It seems to be the case.

Question 1: In the section [components.transformer], what is the value of the "factory" key that should be set for spacy-curated-transformers? Is it "curated-transformer"?

Question 2: In the section [components.parser], we've set the key "source" to the transformer model (en_core_web_trf). Shouldn't it use the transformer-listener automatically?

Please find the relevant config file sections below:

[nlp]
lang = "en"
pipeline = ["transformer","tagger","parser","ner","entity_ruler"]
batch_size = 128
disabled = []
before_creation = null
after_creation = null
after_pipeline_creation = null
tokenizer = {"@Tokenizers":"spacy.Tokenizer.v1"}
vectors = {"@vectors":"spacy.Vectors.v1"}

[components]
[components.entity_ruler]
factory = "entity_ruler"
ent_id_sep = "||"
matcher_fuzzy_compare = {"@misc":"spacy.levenshtein_compare.v1"}
overwrite_ents = "True"
phrase_matcher_attr = null
scorer = {"@scorers":"spacy.entity_ruler_scorer.v1"}
validate = false

[components.ner]
factory = "ner"
incorrect_spans_key = null
moves = null
scorer = {"@scorers":"spacy.ner_scorer.v1"}
update_with_oracle_cut_size = 100

[components.ner.model]
@architectures = "spacy.TransitionBasedParser.v2"
state_type = "ner"
extra_state_tokens = false
hidden_width = 64
maxout_pieces = 2
use_upper = false
nO = null

[components.ner.model.tok2vec]
@architectures = "spacy-transformers.TransformerListener.v1"
grad_factor = 1.0
pooling = {"@layers":"reduce_mean.v1"}
upstream = "*"

[components.parser]
source = "en_core_web_trf"

[components.tagger]
source = "en_core_web_trf"

[components.transformer]
factory = "transformer"
max_batch_items = 4096
set_extra_annotations = {"@annotation_setters":"spacy-transformers.null_annotation_setter.v1"}

[components.transformer.model]
@architectures = "spacy-transformers.TransformerModel.v3"
name = "bert-base-uncased"
mixed_precision = false

[components.transformer.model.get_spans]
@span_getters = "spacy-transformers.strided_spans.v1"
window = 128
stride = 96


danieldk commented on May 22, 2024

Question 1: In the section [components.transformer], what is the value of the "factory" key that should be set for spacy-curated-transformers? Is it "curated-transformer"?

Yes indeed, though with an underscore: the factory name is curated_transformer. But you also have to change the model that is used to the correct architecture. If you use bert-base-uncased, that would be spacy-curated-transformers.BertTransformer.v1:

[components]

[components.transformer]
factory = "curated_transformer"

[components.transformer.model]
@architectures = "spacy-curated-transformers.BertTransformer.v1"

You can then fill the rest of the transformer and initialization configuration with init fill-curated-transformer, specifying the model name (bert-base-uncased) through the --model-name argument.
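
For example, a hypothetical invocation (the config file names below are placeholders, and the positional input/output arguments are assumed to work like those of init fill-config):

python -m spacy init fill-curated-transformer config.cfg config_filled.cfg --model-name bert-base-uncased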

Question 2: In the section [components.parser], we've set the key "source" to the transformer model (en_core_web_trf). Shouldn't it use the transformer-listener automatically?

There are two issues here:

  1. The parser component of en_core_web_trf uses LastTransformerLayerListener as its tok2vec model. So that's what gets copied over when you are sourcing the en_core_web_trf model.
  2. When you change components.transformer to use curated_transformer, the parser listener will work. However, it does not have the effect that you probably want, because it will use the new transformer component and not the transformer component from the en_core_web_trf pipeline. So you'd have to fine-tune the transformer from scratch to work with the parser and tagger components.

Could you give more information about what you are trying to accomplish? If your goal is to add an entity ruler to the pipeline, it will be much easier to take the existing en_core_web_trf model and add the entity ruler to that, as in the sketch below.
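
As a minimal sketch of that approach (the pattern and example text are made up for illustration):

import spacy

# Load the pretrained pipeline and add an entity ruler before the statistical
# NER component, so that the ruler's matches take precedence.
nlp = spacy.load("en_core_web_trf")
ruler = nlp.add_pipe("entity_ruler", before="ner")
ruler.add_patterns([{"label": "ORG", "pattern": "Explosion AI"}])

doc = nlp("Explosion AI builds spaCy.")
print([(ent.text, ent.label_) for ent in doc.ents])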


srikanth-t20 commented on May 22, 2024

Thank you for the update Daniel.

We are planning to build a custom NER component along with an entity ruler, using a pre-trained transformer model.

Initially, there was an issue whose resolution implied that the tagger and attribute_ruler should be added to the pipeline in order to use the POS attribute when building entity ruler patterns. We expected the tagger to pick up its annotations from the pre-trained transformer, hence we set the source accordingly and included these components in the frozen components list.

Coming back to the question that you raised: there wasn't any need for us to use the curated transformer. Everything was working fine before v3.7.2. After upgrading, the pipeline tries to look up a non-existent attribute, as mentioned in the issue description.

It seems that it is automatically loading the curated transformer without us explicitly setting it in any of our config sections.

As you suggested, I'll run init fill-curated-transformer to generate a full config.cfg and review the complete list of changes made to the file in v3.7.2.

Finally, can you please suggest a good documentation source that explains all the sections and their respective key/value pairs that are populated in the config.cfg file?

Thanks again for your help.


danieldk commented on May 22, 2024

The main config sections are described here:

https://spacy.io/api/data-formats#config

spacy-transformers:

https://spacy.io/api/transformer#config
https://spacy.io/api/architectures#TransformerModel

curated-transformers:

https://spacy.io/api/curatedtransformer#config

We also have example configs/projects for transformers:

https://github.com/explosion/spacy-transformers/tree/master/examples/configs
https://github.com/explosion/spacy-curated-transformers/tree/main/project


danieldk commented on May 22, 2024

I'll close this issue, because it is about model configuration and not a bug in spaCy itself. Feel free to open a discussion thread if you have follow-up questions.


github-actions commented on May 22, 2024

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

