Comments (7)

jhyuklee commented on May 27, 2024

Hi @phosseini,
we are working on the custom vocabulary with BERT large version. It might take some time (maybe a couple of months) to find good pre-training steps.

Thanks.

jhyuklee commented on May 27, 2024

Yes, the WordPiece vocab is exactly the same as the original BERT's, for a couple of reasons. First, we wanted to initialize from the pre-trained BERT released by Google, which requires us to use the same WordPiece vocab. Second, because the WordPiece vocab is based on subword units, any new word in the biomedical corpus can still be represented as a sequence of subword embeddings (which may be tuned during fine-tuning). We could build our own vocab from biomedical corpora, but that would lose compatibility with the original pre-trained BERT.
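
For illustration, a quick sketch of this subword fallback (this uses the HuggingFace transformers library purely as an example, not the code BioBERT was trained with):

```python
# Illustration only: the original BERT WordPiece vocab still covers unseen
# biomedical terms by falling back to subword pieces, each of which already
# has a pre-trained embedding.
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")

print(tokenizer.tokenize("thrombocytopenia"))
# Splits into several subword pieces, e.g. ['th', '##rom', '##bo', ...];
# the exact split depends on the vocab, but no piece is out-of-vocabulary.
```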

lucky-bai commented on May 27, 2024

Hi @jhyuklee:

A random idea I had: would it be possible to use a custom vocabulary without redoing the BERT pretraining? One way to transfer the model onto a different vocabulary might proceed as follows:

  1. Train a new Wordpiece vocab.
  2. Use a technique similar to model distillation to learn to replicate the same intermediate representation as the existing BERT, but using the new tokenizer. In other words, minimize the MSE between the old BERT's output after the first layer and the new BERT's output after the first layer (rough sketch after this list). This paper uses a similar technique.
  3. Do this for a number of iterations, only training the first layer while keeping all other layers fixed.
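
For concreteness, step 2 might look roughly like the following PyTorch/transformers sketch (untested; the vocab file name and the `corpus` iterable are placeholders from step 1, mean pooling over tokens is used only to sidestep the length mismatch between the two tokenizations, and only the new word embeddings are trained):

```python
# Untested sketch: train only the new word embeddings so that the hidden
# state after BERT's first layer matches the original (frozen) model's.
import torch
from transformers import BertModel, BertTokenizer

old_tok = BertTokenizer.from_pretrained("bert-base-cased")
old_bert = BertModel.from_pretrained("bert-base-cased").eval()  # frozen teacher

# New WordPiece vocab from step 1 (file name is a placeholder).
new_tok = BertTokenizer(vocab_file="biomedical-wordpiece-vocab.txt")

# Student: same weights as the teacher, but with a freshly initialized
# embedding matrix sized for the new vocabulary.
new_bert = BertModel.from_pretrained("bert-base-cased")
new_bert.resize_token_embeddings(new_tok.vocab_size)
new_bert.embeddings.word_embeddings.weight.data.normal_(mean=0.0, std=0.02)

# Freeze everything except the new word embeddings
# (one could also unfreeze the first encoder layer).
for p in new_bert.parameters():
    p.requires_grad = False
new_bert.embeddings.word_embeddings.weight.requires_grad = True

optimizer = torch.optim.Adam([new_bert.embeddings.word_embeddings.weight], lr=1e-4)

def after_first_layer(model, enc):
    """Mean-pooled hidden state after the first transformer layer."""
    out = model(**enc, output_hidden_states=True)
    # Pool over tokens so the two tokenizations don't have to align 1:1.
    return out.hidden_states[1].mean(dim=1)

for text in corpus:  # `corpus` = iterable of raw biomedical sentences (placeholder)
    with torch.no_grad():
        target = after_first_layer(old_bert, old_tok(text, return_tensors="pt"))
    pred = after_first_layer(new_bert, new_tok(text, return_tensors="pt"))
    loss = torch.nn.functional.mse_loss(pred, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```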

The benefit of this is to avoid most of the expensive BERT pre-training: only the first layer would be trained from scratch, rather than the whole model. Thoughts?

joelkuiper commented on May 27, 2024

Got it! Thanks for the quick and helpful reply 👍

I can understand why keeping compatibility with the original BERT is important. Personally, I would like to have a custom dictionary, since I think there might be some interesting opportunities for fine-tuning: a lot of medical jargon (like drug names and chemicals) has a somewhat unique internal structure that is lost during subword tokenization. But it'd be rude to ask you to train that! Feel free to close, and thank you for this great contribution!

jhyuklee commented on May 27, 2024

We have a plan to use a custom dictionary, but it will require many more GPU hours to pre-train such a model compared to starting from the pre-trained BERT. We'll share it if it works. Thank you for your interest, and I'll close the issue.

phosseini commented on May 27, 2024

> We have a plan to use a custom dictionary, but it will require many more GPU hours to pre-train such a model compared to starting from the pre-trained BERT. We'll share it if it works. Thank you for your interest, and I'll close the issue.

I wonder if there's any update on using the custom dictionary and if it's a work in progress or on your TODO list?

sebpretzer commented on May 27, 2024

Hi @jhyuklee,
I saw you updated your weights with a custom vocabulary. I was wondering if you had any information on how you trained that model. Anything along the lines of:

  1. How long did it take to train your model?
  2. I assume you still used the 8xV100 machine for training?
  3. Did you use the original BERT dataset in your pre-training (Wikipedia + BookCorpus)?

Thank you!
