Comments (6)
Thanks for your interest in our paper. I agree that you can design your experiments in a variety of ways. Regarding your first question, you can think of what we did as roughly a 50-50 split. As you understand correctly, we have two states: an observed state created by the first removal, and a training state created by the second removal. For your second question, we tried to make sure there is no leakage from the training set into the unseen set. Does that make sense?
from optimallinkprediction.
No, I think that's fine too.
Thanks for the quick reply.
I'm trying my best to understand the trade-offs among the different experiment design options; different papers I'm reading appear to do different things.
Yes, the leakage point makes sense (if I'm understanding correctly, you mean leakage between the training and test sets). But if you sample 20% of the edges without replacement twice, independently, won't the overlap be small?
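Back-of-the-envelope, the expected overlap of two independent 20% samples is 0.2 × 0.2 = 4% of all edges. A quick check with toy numbers (just standing in for real edges):

```python
import random

random.seed(0)

n_edges = 10_000
edges = list(range(n_edges))          # stand-ins for the graph's edges
k = int(0.2 * n_edges)                # 20% of the edges

trials = 200
overlap = 0.0
for _ in range(trials):
    test = set(random.sample(edges, k))    # first removal (test positives)
    train = set(random.sample(edges, k))   # second, independent removal
    overlap += len(test & train) / n_edges

print(f"mean overlap: {overlap / trials:.3%}")   # close to 4% of all edges
```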
I've been trying out the method in your paper. I remove 20% of the total edges for the test set, and then another 20% of the total edges for the training set. But I'm getting a strange situation where performance on the test graph is higher than my average 5-fold CV performance (AUROC and AP) while tuning on the train graph.
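For reference, my setup looks roughly like this (a sketch on a small random graph; the helper names are my own, not from the paper's code):

```python
import random

random.seed(1)

# toy undirected graph as an edge list
edges = [(i, j) for i in range(60) for j in range(i + 1, 60)
         if random.random() < 0.1]
n = len(edges)
k = int(0.2 * n)                      # 20% of the total edges

random.shuffle(edges)
test_pos = edges[:k]                  # first removal: held-out test positives
observed = edges[k:]                  # observed graph that the test is scored on

train_pos = random.sample(edges, k)   # second removal: training positives,
                                      # sampled independently (may overlap test_pos)
train_graph = [e for e in observed if e not in set(train_pos)]

print(len(observed) / n)              # test graph keeps ~80% of the edges
print(len(train_graph) / n)           # train graph keeps less structure than that
```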
I thought about it a bit and realized: when we remove edges, aren't we destroying the graph's structure? The test graph has only a little of its structure destroyed, but the training graph has more. So isn't edge prediction on the training graph 'harder' than on the test graph?
Instead of sampling twice independently, couldn't we sample 20% of edges, and then sample a disjoint set of 20% of edges? Then, the training graph and test graphs would have roughly equal amounts of 'destroyed structure', but no leakage?
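To make that concrete, here's roughly what I mean (a sketch on a toy edge list; the names are made up):

```python
import random

random.seed(2)

edges = list(range(1000))              # stand-ins for the graph's edges
k = int(0.2 * len(edges))              # 20% of the total

random.shuffle(edges)
test_pos = set(edges[:k])              # first 20%: held-out test positives
train_pos = set(edges[k:2 * k])        # next 20%: held-out train positives

assert not (test_pos & train_pos)      # disjoint by construction: no shared positives

test_graph = set(edges) - test_pos     # each graph loses exactly 20% of its edges,
train_graph = set(edges) - train_pos   # so the 'destroyed structure' is matched

print(len(test_graph), len(train_graph))   # prints 800 800
```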
Would this make sense?
No, I didn't have this issue; I even have to control for overfitting in my experiments. Would you please explain more about your experiments? What kind of algorithms are you going to stack? Are you using the same networks released on this page?
@Aghasemian Ah, the problem I had was just a bug; mine is also slightly overfitting now. Even putting that aside, is there anything wrong with the setup I mentioned?
Instead of sampling twice independently, couldn't we sample 20% of edges, and then sample a disjoint set of 20% of edges? Then, the training graph and test graphs would have roughly equal amounts of 'destroyed structure', but no leakage?
Thank you for all the help