
Comments (10)

markgllin commented on June 15, 2024

Testing with Lancern's implementation, cosine similarity is consistently reported to be 0.99+, aligning with expectations. I'll see if I can determine why I'm getting different results with asm2vec-pytorch.


oalieno commented on June 15, 2024

Hi @markgllin, great question.
scripts/compare.py also needs to be trained, and the default epoch count is only 10.
The embeddings are initially set to random vectors, so it is reasonable that the cosine similarity score is nearly 0.
You can set the epoch count higher to get a better result.

I also tested this case.
After raising the epoch count to 2000, I get a cosine similarity score of around 0.4, which is still quite low compared to 0.99.
This needs more digging.
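A minimal sketch of what is being measured (illustrative only, not the repo's actual compare.py code): the reported score is just the cosine similarity between the two estimated function embeddings, so with too few training epochs it mostly reflects their random initialization.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
dim = 100                      # hypothetical embedding size

# Freshly initialized, untrained embeddings for the two functions.
emb_a = torch.rand(dim)
emb_b = torch.rand(dim)

# This is the number compare-style scripts report; it only becomes
# meaningful after emb_a and emb_b have been trained for enough epochs
# to move away from their random starting points.
print(F.cosine_similarity(emb_a, emb_b, dim=0).item())
```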


oalieno commented on June 15, 2024

It seems that embedding vector initialization with make_small_ndarray is crucial.
Not sure why 🤔
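For anyone wondering what such an initializer looks like, here is a minimal sketch of the idea (an illustration assuming a small symmetric uniform range, not Lancern's actual code, so the exact range there may differ):

```python
import numpy as np

def make_small_ndarray(dim: int) -> np.ndarray:
    # Illustrative small-value initializer: entries on the order of 1e-3,
    # i.e. comparable to a single gradient step, unlike a uniform [0, 1) init.
    return (np.random.rand(dim) - 0.5) / dim

print(make_small_ndarray(200)[:5])   # values roughly in [-0.0025, 0.0025)
```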


markgllin commented on June 15, 2024

Is make_small_ndarray a method from PyTorch or NumPy? A quick search turns up no documentation for it :S


markgllin commented on June 15, 2024

I only have a very minimal understanding of ML, but wouldn't training for 2000 epochs result in overfitting?


oalieno commented on June 15, 2024

@markgllin make_small_ndarray is a function defined in Lancern's implementation, not in PyTorch or NumPy.
I found the reason.
Lancern's implementation initializes the embeddings with very small vectors, like [0.002, ...] and [-0.001, ...]. The gradient is about the same size as, or even larger than, the initial vector, like [0.004, ...] and [0.003, ...]. So after only one epoch, which includes several updates, both vectors are roughly equal to their accumulated gradients, and those gradients should match because the two inputs are the same function and differ only in the random walk.
This implementation, on the other hand, initializes the embeddings with random vectors in the range 0 to 1, like [0.65, ...] and [0.22, ...], and the gradient is small compared to the initial vector, like [0.0017, ...] and [0.0012, ...]. So after only one epoch, which includes several updates, the vectors are still roughly equal to [0.65, ...] and [0.22, ...], because they are not fully trained yet.
In conclusion, if we only train for one epoch: in Lancern's implementation the embedding ends up roughly equal to the gradient vector, while in this implementation it stays roughly equal to a random vector.
It comes down to how to set the learning rate and how to train the model well. This still needs some research into how models like word2vec and doc2vec are trained well.
As for overfitting, see https://www.reddit.com/r/MachineLearning/comments/3rcnfl/overfitting_in_word2vec/ and https://groups.google.com/g/gensim/c/JtUhgUjx4YI. I am not worried about overfitting right now.
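A rough numeric illustration of the argument above (toy numbers only, not the real training loop): apply the same handful of gradient-sized updates to a tiny-valued start and to a [0, 1) start, then compare how much each vector moves relative to where it began.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, steps = 200, 50

# Toy per-step updates of the magnitude mentioned above (~1e-3 per entry).
updates = rng.normal(scale=0.003, size=(steps, dim))

small_init = rng.uniform(-0.0025, 0.0025, dim)   # tiny init, Lancern-style
large_init = rng.uniform(0.0, 1.0, dim)          # [0, 1) init, as in this repo

small_final = small_init + updates.sum(axis=0)
large_final = large_init + updates.sum(axis=0)

def relative_change(start, end):
    return float(np.linalg.norm(end - start) / np.linalg.norm(start))

print(relative_change(small_init, small_final))  # >> 1: vector is mostly the updates
print(relative_change(large_init, large_final))  # << 1: vector is mostly its init
```

With the tiny initialization the final vector is dominated by the shared updates, so two walks over the same function end up nearly parallel, while with the [0, 1) initialization each vector is still dominated by its random start.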


markgllin commented on June 15, 2024

ahh I understand. In that case, this sounds like a non-issue and more like determining the appropriate parameters. I'll continue tinkering with that. Feel free to close this issue. Thanks!


oalieno commented on June 15, 2024

I will push a new commit to change the default settings for the initial vectors and the learning rate.


oalieno commented on June 15, 2024

Oh wait, I see your pull request. Yeah, a custom learning rate also works.


oalieno commented on June 15, 2024

19e4e25
The default Adam learning rate seems a bit small; setting the default to 0.02.
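For reference, PyTorch's Adam optimizer defaults to lr=1e-3, so raising the project default to 0.02 amounts to something like the sketch below (the model here is just a stand-in, not the repo's actual class):

```python
import torch

# Stand-in module; the real project would pass its asm2vec model here.
model = torch.nn.Embedding(10000, 200)

# torch.optim.Adam defaults to lr=1e-3, which moves these embeddings
# slowly; the commit above raises the project's default to 0.02.
optimizer = torch.optim.Adam(model.parameters(), lr=0.02)
```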

