windchimeran / copymtl — 122 stars, 2 watchers, 23 forks, 3.52 MB

AAAI 2020: "CopyMTL: Copy Mechanism for Joint Extraction of Entities and Relations with Multi-Task Learning"

Python 99.45% Shell 0.55%
acl pytorch paper copynet relation-extraction aaai2020 aaai nyt

copymtl's People

Contributors

windchimeran

copymtl's Issues

How many triples can be extracted?

Hello! As I understand it, the method proposed in this paper, whether One-Decoder or Multi-Decoder, can extract an arbitrary number of triples. Yet in the Conclusions section you state that the model can only extract a fixed number of triples. Am I misunderstanding something?

Does evaluation also consider only the "last word" of each entity?

Thank you very much for your work. This work is meant to solve the multi-word entity problem, but the data processing seems identical to CopyRE's, i.e., only the last word of each entity is kept. Doesn't that defeat the paper's purpose?

From what I can see, the processed data records only the position of each entity's last word, not its start position. Using this for the training set is fine, but wouldn't processing the test and dev sets the same way cause a problem?

What defines the order of the relation triples?

Hello,

I was wondering what defines the order in which the decoder of the CopyMTL model outputs the (relation, head, tail) triples.
How did you order the triples in each training sentence's ground truth?

How to get the id.json?

I have the original data.
What should I do to convert the original dataset (strings) into id numbers?
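Converting raw tokens to ids is typically done with a frequency-sorted vocabulary. Below is a minimal sketch of that idea; the function name, the reserved `<pad>`/`<unk>` ids, and the sample sentences are assumptions for illustration, not the repo's actual preprocessing code.

```python
# Hypothetical sketch: build a word -> id vocabulary and encode sentences.
# Ids 0 and 1 are reserved for padding and unknown words (an assumption;
# the real id.json may use a different convention).
from collections import Counter

def build_word2id(sentences, min_count=1):
    """Map each word seen at least min_count times to an integer id."""
    counter = Counter(w for s in sentences for w in s.split())
    word2id = {"<pad>": 0, "<unk>": 1}
    for word, count in counter.most_common():
        if count >= min_count:
            word2id[word] = len(word2id)
    return word2id

sentences = ["john lives in new york", "mary works in london"]
word2id = build_word2id(sentences)

# Encode a sentence, falling back to <unk> for out-of-vocabulary words.
ids = [word2id.get(w, word2id["<unk>"]) for w in "john works in london".split()]
```

The resulting mapping can then be dumped with `json.dump` to produce a file in the spirit of `id.json`.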

How many epochs did you train on the WebNLG dataset?

Below is the log from training the model on WebNLG. The loss struggles to decline, and training starts from a higher loss value than in the log file you shared.
```
config filename: ./config.json
load_embedding!
loading ./data/webnlg/entity_end_position/train.json
data size 5019
loading ./data/webnlg/entity_end_position/dev.json
data size 703
epoch 1 loss: 32.915226 F1: 0.002298 P: 0.002868 R: 0.001917
epoch 2 loss: 31.900913 F1: 0.009905 P: 0.012264 R: 0.008307
epoch 3 loss: 27.510458 F1: 0.015719 P: 0.018970 R: 0.013419
epoch 4 loss: 28.100975 F1: 0.025424 P: 0.032008 R: 0.021086
epoch 5 loss: 26.887936 F1: 0.050152 P: 0.061856 R: 0.042173
epoch 6 loss: 24.247564 F1: 0.065084 P: 0.081184 R: 0.054313
epoch 7 loss: 24.565767 F1: 0.071375 P: 0.086600 R: 0.060703
epoch 8 loss: 25.273928 F1: 0.078818 P: 0.096834 R: 0.066454
epoch 9 loss: 20.739996 F1: 0.080451 P: 0.097717 R: 0.068371
epoch 10 loss: 23.133522 F1: 0.081061 P: 0.099535 R: 0.068371
epoch 11 loss: 23.140676 F1: 0.083801 P: 0.101083 R: 0.071565
epoch 12 loss: 23.790190 F1: 0.078331 P: 0.091688 R: 0.068371
epoch 13 loss: 25.107319 F1: 0.099564 P: 0.115417 R: 0.087540
epoch 14 loss: 23.819378 F1: 0.090511 P: 0.105532 R: 0.079233
epoch 15 loss: 22.022694 F1: 0.086194 P: 0.100597 R: 0.075399
epoch 16 loss: 23.545181 F1: 0.090909 P: 0.105485 R: 0.079872
epoch 17 loss: 22.910625 F1: 0.098246 P: 0.108949 R: 0.089457
epoch 18 loss: 19.319550 F1: 0.115789 P: 0.128405 R: 0.105431
epoch 19 loss: 21.152763 F1: 0.099858 P: 0.111994 R: 0.090096
epoch 20 loss: 21.946230 F1: 0.095470 P: 0.104981 R: 0.087540
epoch 21 loss: 19.299545 F1: 0.102300 P: 0.110534 R: 0.095208
epoch 22 loss: 23.689260 F1: 0.097695 P: 0.105812 R: 0.090735
epoch 23 loss: 23.338881 F1: 0.091374 P: 0.097953 R: 0.085623
epoch 24 loss: 21.140081 F1: 0.099003 P: 0.107143 R: 0.092013
epoch 25 loss: 20.936872 F1: 0.100138 P: 0.108941 R: 0.092652
epoch 26 loss: 18.399645 F1: 0.103602 P: 0.111852 R: 0.096486
epoch 27 loss: 21.461807 F1: 0.101880 P: 0.109559 R: 0.095208
epoch 28 loss: 21.085964 F1: 0.098673 P: 0.105531 R: 0.092652
epoch 29 loss: 21.767021 F1: 0.100812 P: 0.107117 R: 0.095208
epoch 30 loss: 21.892426 F1: 0.102096 P: 0.108399 R: 0.096486
epoch 31 loss: 21.119205 F1: 0.089643 P: 0.095652 R: 0.084345
epoch 32 loss: 20.188158 F1: 0.105983 P: 0.113971 R: 0.099042
epoch 33 loss: 22.146685 F1: 0.105727 P: 0.112554 R: 0.099681
epoch 34 loss: 22.138407 F1: 0.093960 P: 0.098940 R: 0.089457
epoch 35 loss: 21.843102 F1: 0.096566 P: 0.103198 R: 0.090735
epoch 36 loss: 19.749903 F1: 0.094840 P: 0.104375 R: 0.086901
epoch 37 loss: 20.443235 F1: 0.099729 P: 0.106291 R: 0.093930
epoch 38 loss: 19.464769 F1: 0.099865 P: 0.105790 R: 0.094569
epoch 39 loss: 22.916927 F1: 0.104223 P: 0.111597 R: 0.097764
epoch 40 loss: 22.332972 F1: 0.096533 P: 0.103123 R: 0.090735
epoch 41 loss: 21.962793 F1: 0.094883 P: 0.101010 R: 0.089457
epoch 42 loss: 22.766172 F1: 0.095174 P: 0.100858 R: 0.090096
epoch 43 loss: 22.344751 F1: 0.097002 P: 0.102564 R: 0.092013
epoch 44 loss: 23.941555 F1: 0.099529 P: 0.105039 R: 0.094569
epoch 45 loss: 22.505873 F1: 0.100633 P: 0.105153 R: 0.096486
epoch 46 loss: 20.527216 F1: 0.105898 P: 0.111346 R: 0.100958
epoch 47 loss: 19.862801 F1: 0.098459 P: 0.103448 R: 0.093930
epoch 48 loss: 20.391645 F1: 0.101604 P: 0.106517 R: 0.097125
epoch 49 loss: 20.863894 F1: 0.104035 P: 0.108787 R: 0.099681
epoch 50 loss: 21.055967 F1: 0.103010 P: 0.108070 R: 0.098403
epoch 51 loss: 20.388582 F1: 0.099599 P: 0.104415 R: 0.095208
epoch 52 loss: 22.958044 F1: 0.101469 P: 0.106219 R: 0.097125
epoch 53 loss: 20.523462 F1: 0.102513 P: 0.107746 R: 0.097764
epoch 54 loss: 20.575268 F1: 0.098732 P: 0.103280 R: 0.094569
epoch 55 loss: 22.237806 F1: 0.101536 P: 0.106368 R: 0.097125
epoch 56 loss: 20.335493 F1: 0.106525 P: 0.111188 R: 0.102236
epoch 57 loss: 20.177532 F1: 0.095365 P: 0.099721 R: 0.091374
epoch 58 loss: 18.787577 F1: 0.094126 P: 0.098532 R: 0.090096
epoch 59 loss: 21.285160 F1: 0.099300 P: 0.103760 R: 0.095208
epoch 60 loss: 21.078987 F1: 0.101672 P: 0.106667 R: 0.097125
epoch 61 loss: 21.739445 F1: 0.098262 P: 0.103013 R: 0.093930
epoch 62 loss: 17.307463 F1: 0.101604 P: 0.106517 R: 0.097125
epoch 63 loss: 20.719521 F1: 0.097724 P: 0.102600 R: 0.093291
epoch 64 loss: 23.640509 F1: 0.106000 P: 0.110801 R: 0.101597
epoch 65 loss: 21.545544 F1: 0.099967 P: 0.104457 R: 0.095847
epoch 66 loss: 19.790474 F1: 0.098997 P: 0.103860 R: 0.094569
epoch 67 loss: 19.295521 F1: 0.104388 P: 0.108801 R: 0.100319
epoch 68 loss: 19.242487 F1: 0.102838 P: 0.107692 R: 0.098403
epoch 69 loss: 21.927759 F1: 0.102632 P: 0.107242 R: 0.098403
epoch 70 loss: 21.096117 F1: 0.097804 P: 0.102012 R: 0.093930
epoch 71 loss: 19.201813 F1: 0.101358 P: 0.105227 R: 0.097764
epoch 72 loss: 20.965561 F1: 0.099933 P: 0.104384 R: 0.095847
epoch 73 loss: 22.830009 F1: 0.098274 P: 0.102281 R: 0.094569
epoch 74 loss: 21.505526 F1: 0.103540 P: 0.108467 R: 0.099042
epoch 75 loss: 19.773630 F1: 0.100399 P: 0.104643 R: 0.096486
epoch 76 loss: 20.365486 F1: 0.099536 P: 0.103520 R: 0.095847
epoch 77 loss: 21.803555 F1: 0.104914 P: 0.109191 R: 0.100958
epoch 78 loss: 19.608545 F1: 0.098525 P: 0.103594 R: 0.093930
epoch 79 loss: 24.478024 F1: 0.079640 P: 0.086924 R: 0.073482
epoch 80 loss: 24.325733 F1: 0.079002 P: 0.086298 R: 0.072843
epoch 81 loss: 21.051134 F1: 0.100105 P: 0.110681 R: 0.091374
epoch 82 loss: 20.675182 F1: 0.103472 P: 0.110706 R: 0.097125
epoch 83 loss: 20.217590 F1: 0.096589 P: 0.102436 R: 0.091374
epoch 84 loss: 23.057375 F1: 0.099596 P: 0.105188 R: 0.094569
epoch 85 loss: 22.329268 F1: 0.096625 P: 0.100206 R: 0.093291
epoch 86 loss: 25.108236 F1: 0.093677 P: 0.098315 R: 0.089457
epoch 87 loss: 20.028948 F1: 0.092852 P: 0.097271 R: 0.088818
epoch 88 loss: 23.089718 F1: 0.089692 P: 0.094167 R: 0.085623
epoch 89 loss: 18.960468 F1: 0.089692 P: 0.094167 R: 0.085623
epoch 90 loss: 22.726799 F1: 0.091122 P: 0.095775 R: 0.086901
epoch 91 loss: 22.099943 F1: 0.099967 P: 0.104457 R: 0.095847
epoch 92 loss: 23.679205 F1: 0.090576 P: 0.094576 R: 0.086901
epoch 93 loss: 22.687222 F1: 0.094220 P: 0.098739 R: 0.090096
epoch 94 loss: 23.637705 F1: 0.097967 P: 0.102368 R: 0.093930
epoch 95 loss: 19.864727 F1: 0.095270 P: 0.099513 R: 0.091374
epoch 96 loss: 25.225853 F1: 0.095143 P: 0.099237 R: 0.091374
epoch 97 loss: 20.020922 F1: 0.103758 P: 0.108183 R: 0.099681
epoch 98 loss: 23.542727 F1: 0.098700 P: 0.103208 R: 0.094569
epoch 99 loss: 22.641554 F1: 0.098065 P: 0.102582 R: 0.093930
epoch 100 loss: 21.837103 F1: 0.101367 P: 0.105997 R: 0.097125
epoch 101 loss: 18.833443 F1: 0.104035 P: 0.108787 R: 0.099681
epoch 102 loss: 21.309443 F1: 0.105263 P: 0.109951 R: 0.100958
epoch 103 loss: 19.390417 F1: 0.101661 P: 0.105882 R: 0.097764
epoch 104 loss: 19.361664 F1: 0.105894 P: 0.110570 R: 0.101597
epoch 105 loss: 20.413847 F1: 0.100633 P: 0.105153 R: 0.096486
epoch 106 loss: 20.095713 F1: 0.104423 P: 0.108877 R: 0.100319
epoch 107 loss: 21.468578 F1: 0.105718 P: 0.110187 R: 0.101597
epoch 108 loss: 20.518288 F1: 0.106870 P: 0.111188 R: 0.102875
epoch 109 loss: 20.153645 F1: 0.109079 P: 0.113731 R: 0.104792
epoch 110 loss: 21.636946 F1: 0.107119 P: 0.111728 R: 0.102875
epoch 111 loss: 18.999245 F1: 0.109854 P: 0.114663 R: 0.105431
epoch 112 loss: 22.469585 F1: 0.112255 P: 0.116874 R: 0.107987
epoch 113 loss: 22.612556 F1: 0.108306 P: 0.112803 R: 0.104153
epoch 114 loss: 21.570963 F1: 0.107534 P: 0.111878 R: 0.103514
epoch 115 loss: 20.076935 F1: 0.102853 P: 0.106970 R: 0.099042
epoch 116 loss: 19.920132 F1: 0.108970 P: 0.113495 R: 0.104792
epoch 117 loss: 21.485884 F1: 0.106136 P: 0.110345 R: 0.102236
epoch 118 loss: 22.931591 F1: 0.112292 P: 0.116955 R: 0.107987
epoch 119 loss: 24.095680 F1: 0.110446 P: 0.115198 R: 0.106070
epoch 120 loss: 20.244862 F1: 0.105859 P: 0.110493 R: 0.101597
epoch 121 loss: 20.845545 F1: 0.104775 P: 0.108890 R: 0.100958
epoch 122 loss: 19.785984 F1: 0.106799 P: 0.111034 R: 0.102875
epoch 123 loss: 17.929617 F1: 0.112330 P: 0.117036 R: 0.107987
epoch 124 loss: 20.878696 F1: 0.104949 P: 0.109267 R: 0.100958
epoch 125 loss: 20.286144 F1: 0.107190 P: 0.111883 R: 0.102875
epoch 126 loss: 21.894135 F1: 0.112330 P: 0.117036 R: 0.107987
epoch 127 loss: 22.584496 F1: 0.106916 P: 0.112045 R: 0.102236
epoch 128 loss: 21.797398 F1: 0.105123 P: 0.109646 R: 0.100958
epoch 129 loss: 23.331161 F1: 0.104354 P: 0.108726 R: 0.100319
epoch 130 loss: 23.423994 F1: 0.112397 P: 0.116438 R: 0.108626
epoch 131 loss: 21.362347 F1: 0.112844 P: 0.117403 R: 0.108626
epoch 132 loss: 22.722223 F1: 0.111406 P: 0.115782 R: 0.107348
epoch 133 loss: 19.547405 F1: 0.110116 P: 0.114483 R: 0.106070
epoch 134 loss: 21.386782 F1: 0.106101 P: 0.110269 R: 0.102236
epoch 135 loss: 19.975340 F1: 0.109333 P: 0.114286 R: 0.104792
epoch 136 loss: 18.630228 F1: 0.108825 P: 0.113182 R: 0.104792
```

Dataset Problem

Hello!
I have some questions about this project.
Could you share the original dataset? My email is [email protected]. Thank you.
Did you filter out all sentences that contain only the None relation?

Thank you!

Do the NovelTagging and GraphRel results in the paper also consider only the last token?

[image: results table]
Hi, nice work. I'd like to ask a question: the paper only says that CopyRE is less strict, but in this table NovelTagging and GraphRel also consider only the last token, right? The NovelTagging numbers are copied from CopyRE, and the GraphRel numbers are copied from its original paper (which states that it only looks at the last token). So why is only CopyRE singled out here?

Where is the sequence-labeling loss?

Hello,
I can't find the sequence-labeling loss in your code.
I can only find the loss for the decoder, as follows:

    loss = 0
    for t in range(self.seq2seq.decoder.decodelen):
        loss = loss + self.loss(pred_logits_list[t], triplets[:, t])
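For context, the multi-task objective the paper describes combines the decoder's triple-prediction loss with a cross-entropy loss from the encoder-side sequence-labeling (NER) head. The sketch below illustrates that combination with random tensors; the tensor shapes, variable names, and unweighted sum are assumptions for illustration, not the repo's actual implementation.

```python
# Hypothetical sketch of a joint multi-task loss: decoder triple loss
# plus encoder sequence-labeling (NER) loss, both cross-entropy.
import torch
import torch.nn as nn

ce = nn.CrossEntropyLoss()
batch, seq_len, n_tags, n_vocab, dec_len = 2, 5, 4, 10, 3

# Encoder-side NER head: one tag distribution per input token.
ner_logits = torch.randn(batch, seq_len, n_tags)
ner_gold = torch.randint(0, n_tags, (batch, seq_len))
ner_loss = ce(ner_logits.reshape(-1, n_tags), ner_gold.reshape(-1))

# Decoder-side triple prediction, analogous to the snippet above.
dec_logits = [torch.randn(batch, n_vocab) for _ in range(dec_len)]
triplets = torch.randint(0, n_vocab, (batch, dec_len))
dec_loss = sum(ce(dec_logits[t], triplets[:, t]) for t in range(dec_len))

# Joint objective: a (possibly weighted) sum of the two terms.
loss = dec_loss + ner_loss
```

If the released code only backpropagates the decoder term, that would indeed omit the labeling loss the paper motivates, which is what this issue is asking about.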

Question about multi-token entity in the dataset

Hi Haoran, I notice that you directly use the preprocessed datasets released by CopyRE [Zeng et al., 2018] in your experiments. I checked their preprocessing scripts in this repo and downloaded the datasets, and found that they use only the last word to represent each entity in both the training and test sets. As a consequence, the preprocessed datasets contain no multi-token entities, and every annotated triple consists of two tokens and one relation type.

The proposed CopyMTL can conceptually address the multi-token entity problem. However, I wonder whether the adopted dataset is a suitable testbed for validating this claim, given that it contains no multi-token entities at all. Could you please explain this issue a little further? Or am I missing something? Thank you very much!
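To make the concern concrete, here is a small sketch of the "last word only" preprocessing convention the issue describes: a multi-token entity is reduced to the index of its final token, so its start position is lost. The function name and the example sentence are hypothetical, for illustration only.

```python
# Hypothetical illustration: representing an entity span by the index
# of its last token, as the CopyRE-style preprocessing reportedly does.
def last_token_position(sentence_tokens, entity_tokens):
    """Return the index of the entity's last token in the sentence,
    or -1 if the entity span does not occur."""
    n = len(entity_tokens)
    for i in range(len(sentence_tokens) - n + 1):
        if sentence_tokens[i:i + n] == entity_tokens:
            return i + n - 1
    return -1

tokens = "barack obama was born in honolulu".split()
pos = last_token_position(tokens, ["barack", "obama"])  # -> 1
```

Under this convention the two-token entity "barack obama" collapses to the single index of "obama", which is why the preprocessed datasets contain no multi-token entities to test on.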
