wepe / tgboost
Tiny Gradient Boosting Tree
License: MIT License
Why pass reg_lambda into the loss class?
I don't see any usage of reg_lambda in the loss class.
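For context on where the regularization term usually lives: in XGBoost-style boosting, reg_lambda enters the leaf-weight and split-gain computation in the tree builder, not the per-sample grad/hess produced by the loss class. A minimal sketch (function names are mine, not tgboost's):

```python
import numpy as np

def leaf_weight(grad, hess, reg_lambda):
    """Optimal leaf weight in XGBoost-style boosting: w* = -G / (H + lambda).
    reg_lambda shrinks the weight; it never touches grad/hess themselves."""
    return -grad.sum() / (hess.sum() + reg_lambda)

def split_gain(G_l, H_l, G_r, H_r, reg_lambda):
    """Gain of a candidate split; reg_lambda appears in every denominator."""
    def score(G, H):
        return G * G / (H + reg_lambda)
    return 0.5 * (score(G_l, H_l) + score(G_r, H_r) - score(G_l + G_r, H_l + H_r))

grad = np.array([0.5, -0.3, 0.2])
hess = np.array([0.25, 0.21, 0.16])
print(leaf_weight(grad, hess, reg_lambda=1.0))  # -0.4 / 1.62
```

So passing reg_lambda into the loss class would indeed be dead weight unless the loss also computed leaf values.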
Actually, one of these should be the minimum number of samples per child, right?
Thanks for sharing.
The issue is in tree.py, line 261:
pool = Pool() is missing the num_thread argument.
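Assuming the standard-library multiprocessing API, the fix would be to pass the worker count explicitly; num_thread here is just the parameter name the issue mentions:

```python
from multiprocessing import Pool

def square(x):
    return x * x

if __name__ == "__main__":
    num_thread = 4  # the configured thread count the issue refers to
    # Pool() with no argument defaults to os.cpu_count() workers;
    # passing processes= makes the configured num_thread take effect.
    with Pool(processes=num_thread) as pool:
        print(pool.map(square, [1, 2, 3]))  # [1, 4, 9]
```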
Could you explain how the gradient of the logistic loss is computed?
For a single sample, the logistic loss is loss(y, x) = log(1 + exp(-y * f(x))),
where f(x) is the hypothesis function; its derivative with respect to f(x) is -y / (exp(y * f(x)) + 1).
This doesn't match the formula in your code, grad = (1 - y) / (1 - pred) - y / pred.
Did I make a mistake somewhere? Looking forward to your reply.
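For what it's worth, the derivative above can be checked numerically under the y in {-1, +1} convention; the formula in the code differentiates the y in {0, 1} cross-entropy with respect to pred rather than f(x), which is where the apparent mismatch comes from. A quick check:

```python
import math

def loss(y, f):
    # logistic loss for labels y in {-1, +1}
    return math.log(1.0 + math.exp(-y * f))

def grad(y, f):
    # analytic derivative: d loss / d f = -y / (exp(y * f) + 1)
    return -y / (math.exp(y * f) + 1.0)

# central-difference check of the analytic derivative
for y in (1.0, -1.0):
    f, eps = 0.3, 1e-6
    numeric = (loss(y, f + eps) - loss(y, f - eps)) / (2 * eps)
    assert abs(numeric - grad(y, f)) < 1e-6
```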
In the current code logic, categorical (discrete) features are also binned and then handled exactly like continuous features. This implicitly assumes the categorical feature is ordered, so m distinct feature values give m - 1 bins. But if the categorical feature is unordered, the number of possible partitions is exponential. Could this be optimized? Or does xgboost also treat categorical features as ordered?
For reference, the decision tree in Spark first checks whether a categorical feature is ordered or unordered, and only then does the binning.
I'm still learning and would really like to fully understand your code...
I don't quite understand these two names and can't figure out why they are called that. Could you explain what they mean? Thanks a lot~ I've been studying xgboost recently and want to work through your open-source code and then write my own. There is just no documentation, so it's hard to follow...
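To make the ordered-vs-unordered gap concrete: treating m distinct categorical values as ordered yields m - 1 threshold splits, while the unordered treatment must consider 2^(m-1) - 1 subset splits (which, as I understand it, is what Spark enumerates for low-arity unordered features). A sketch:

```python
from itertools import combinations

def ordered_split_count(m):
    # binned-as-continuous: only m - 1 threshold splits
    return m - 1

def unordered_splits(values):
    # enumerate nonempty proper subset splits, up to left/right symmetry
    values = sorted(values)
    anchor = values[0]  # pin one value on the left to avoid mirror duplicates
    rest = values[1:]
    for r in range(len(rest) + 1):
        for combo in combinations(rest, r):
            left = {anchor, *combo}
            if len(left) < len(values):
                yield left

cats = ["a", "b", "c", "d"]
print(len(list(unordered_splits(cats))))  # 2**(4-1) - 1 = 7
print(ordered_split_count(4))             # 3
```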
The original code is:
private void initialize_cutting_inds_thresholds(){
    cutting_inds = new int[feature_dim][][];
    cutting_thresholds = new float[feature_dim][];
    for(int i=0;i<feature_dim;i++){
        //for this feature, get its cutting index
        ArrayList<Integer> list = new ArrayList<>();
        int last_index = 0;
        for(int j=0;j<attribute_list[i].length;j++){
            if(attribute_list[i][j][0]==attribute_list[i][last_index][0]){
                last_index = j;
            }else {
                list.add(last_index);
                last_index = j;
            }
        }
        //for this feature, store its cutting threshold
        cutting_thresholds[i] = new float[list.size()+1];
        for(int t=0;t<cutting_thresholds[i].length-1;t++){
            cutting_thresholds[i][t] = attribute_list[i][list.get(t)][0];
        }
        cutting_thresholds[i][list.size()] = attribute_list[i][list.get(list.size()-1)+1][0];
        //for this feature, store inds of each interval
        cutting_inds[i] = new int[list.size()+1][]; //list.size()+1 intervals
        list.add(0,-1);
        list.add(attribute_list[i].length-1);
        for(int k=0;k<cutting_inds[i].length;k++){
            int start_ind = list.get(k)+1;
            int end_ind = list.get(k+1);
            cutting_inds[i][k] = new int[end_ind-start_ind+1];
            for(int m=0;m<cutting_inds[i][k].length;m++){
                cutting_inds[i][k][m] = (int) attribute_list[i][start_ind+m][1];
            }
        }
    }
}
The edited code is:
private void initialize_cutting_idx_thresholds() {
    cutting_idx = new int[feature_dim][][];
    cutting_thresholds = new double[feature_dim][];
    for (int i = 0; i < feature_dim; ++i) {
        List<Integer> list = new ArrayList<>();
        int last_index = -1;
        for (int j = 0; j < attribute_list[i].length; ++j) {
            if (last_index == -1 || attribute_list[i][j][0] == attribute_list[i][last_index][0]) {
                last_index = j;
            } else {
                list.add(last_index);
                last_index = j;
            }
        }
        cutting_thresholds[i] = new double[list.size()];
        for (int t = 0; t < cutting_thresholds[i].length; ++t) {
            cutting_thresholds[i][t] = attribute_list[i][list.get(t)][0];
        }
        cutting_idx[i] = new int[list.size()][];
        list.add(attribute_list[i].length);
        for (int k = 0; k < cutting_idx[i].length; ++k) {
            int s_idx = list.get(k);
            int e_idx = list.get(k + 1);
            cutting_idx[i][k] = new int[e_idx - s_idx];
            for (int m = 0; m < cutting_idx[i][k].length; ++m) {
                cutting_idx[i][k][m] = (int) attribute_list[i][s_idx + m][1];
            }
        }
    }
}
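For readers following along, here is one reading of what this routine computes, sketched in Python (assuming attribute_list[i] holds (value, original_row_index) pairs pre-sorted by value): one threshold and one group of row indices per distinct feature value.

```python
from itertools import groupby

def cutting_idx_thresholds(sorted_pairs):
    """sorted_pairs: list of (value, row_index) tuples sorted by value.
    Returns (thresholds, groups): one threshold and one group of row
    indices per distinct value, mirroring the intent of the Java code."""
    thresholds, groups = [], []
    for value, run in groupby(sorted_pairs, key=lambda p: p[0]):
        thresholds.append(value)
        groups.append([idx for _, idx in run])
    return thresholds, groups

pairs = [(0.1, 3), (0.1, 0), (0.5, 2), (0.9, 1)]
print(cutting_idx_thresholds(pairs))
# ([0.1, 0.5, 0.9], [[3, 0], [2], [1]])
```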
Another article: https://blog.csdn.net/u010159842/article/details/77503930
It shows the grad and hess formulas of the logistic loss as:
grad = pred - label
hess = pred * (1 - pred)
when the logistic loss is:
loss = -[label * log(pred) + (1 - label) * log(1 - pred)], with pred = 1 / (1 + exp(-f))
And when I read the source of xgboost, I found the same formulas in its code.
So, is there something wrong with your code?
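The two formulations are in fact consistent: with labels y in {0, 1} and pred = sigmoid(f), multiplying the code's gradient with respect to pred, (1 - y) / (1 - pred) - y / pred, by dpred/df = pred * (1 - pred) collapses to pred - y, the form xgboost uses. A check:

```python
import math

def sigmoid(f):
    return 1.0 / (1.0 + math.exp(-f))

def grad_wrt_pred(y, pred):
    # tgboost-style: derivative of the cross-entropy with respect to pred
    return (1 - y) / (1 - pred) - y / pred

def grad_wrt_f(y, f):
    # xgboost-style: derivative with respect to the raw score f
    return sigmoid(f) - y

for y in (0.0, 1.0):
    f = 0.7
    pred = sigmoid(f)
    # chain rule through the sigmoid: dpred/df = pred * (1 - pred)
    chain = grad_wrt_pred(y, pred) * pred * (1 - pred)
    assert abs(chain - grad_wrt_f(y, f)) < 1e-12
```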
In the code there is this_threshold = (cur_value + nxt_value) / 2.0; if the current column is categorical, does that mean tgboost does not support it yet?
Also, I haven't figured out why Y.hess.sum() < self.min_child_weight uses the sum of the second derivatives as the stopping criterion. Thank you.
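One way to read that check: for squared loss the second derivative of every sample is 1, so Y.hess.sum() is exactly the number of samples in the child; for logistic loss hess = pred * (1 - pred), so the sum becomes a confidence-weighted sample count. min_child_weight thus generalizes a minimum-samples-per-child rule. A sketch (array shapes assumed):

```python
import numpy as np

def hess_squared_loss(pred):
    # d^2/df^2 of 0.5 * (y - f)^2 is 1 for every sample
    return np.ones_like(pred)

def hess_logistic_loss(pred):
    # d^2/df^2 of the logistic loss: pred * (1 - pred), in (0, 0.25]
    return pred * (1 - pred)

pred = np.array([0.5, 0.9, 0.99])
print(hess_squared_loss(pred).sum())  # 3.0, the plain sample count
print(hess_logistic_loss(pred).sum()) # ~0.3499: confident samples count for less
```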
Hello, I can't download the dataset from Baidu.