Comments (9)
Hi,
First, make sure your Caffe is built with Python layer support. If not, open "Makefile.config" in the Caffe root, uncomment the line "WITH_PYTHON_LAYER := 1", and rebuild Caffe.
Then go to the tripletloss root path and add that path to train.py.
Change the settings in config.py so that Caffe can create the network.
Note: before training the network, you should read the code first.
from tripletloss.
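The "add the tripletloss root path to train.py" step above can be sketched in a few lines at the top of train.py. The path below is a placeholder for wherever you cloned the repo, not its actual location:

```python
import os
import sys

# Placeholder path: adjust to wherever you cloned the tripletloss repo.
TRIPLETLOSS_ROOT = os.path.expanduser("~/tripletloss")

# Prepend the project root so imports such as `from tripletloss import datalayer`
# (and the Python layers referenced in the prototxt) can be resolved.
if TRIPLETLOSS_ROOT not in sys.path:
    sys.path.insert(0, TRIPLETLOSS_ROOT)
```

Setting `PYTHONPATH` in the shell before launching training accomplishes the same thing.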
When I run train.py, an error occurred: "No module named tripletloss.datalayer".
You said "Add tripletloss root path to train.py", but I only changed the paths in the main function. Is that right?
from tripletloss.
The "No module named tripletloss.datalayer" error is solved: setting the module to datalayer fixed it.
Now another problem appears: "AttributeError: 'DataLayer' object has no attribute 'param_str_'". I searched the Caffe source and could not find param_str_ either. Where is the mistake?
from tripletloss.
Your Python path cannot find them: these data layers are all written in Python, so you need to add the project root path to your Python path.
from tripletloss.
What comparison results did you get? I see that tripletselectlayer records a triplet if an > ap; shouldn't that be ap > an?
from tripletloss.
For the semi-hard selection, please refer to the paper.
Personally I feel there should be some tricks here: it seems sample selection can be tuned by setting an interval parameter. I made some adjustments myself, but so far I have not seen choosing the interval an > ap + beta make any difference in the results.
from tripletloss.
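The semi-hard criterion discussed above (negatives farther than the positive but still inside the margin, i.e. ap < an < ap + alpha, following the FaceNet paper) can be sketched with NumPy. This is an illustrative sketch, not the repo's tripletselectlayer code; the function name and margin default are assumptions:

```python
import numpy as np

def select_semi_hard(anchor, positive, negatives, alpha=0.2):
    """Return indices of semi-hard negatives: farther from the anchor
    than the positive, but still within the margin alpha
    (i.e. ap < an < ap + alpha)."""
    ap = np.sum((anchor - positive) ** 2)           # squared anchor-positive distance
    an = np.sum((negatives - anchor) ** 2, axis=1)  # squared anchor-negative distances
    return np.where((an > ap) & (an < ap + alpha))[0]
```

For example, with an anchor at the origin, a positive at distance^2 = 0.09, and negatives at distance^2 = 0.04, 0.16, and 1.0, only the middle negative (0.09 < 0.16 < 0.29) is selected.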
OK, I saw it in the paper. Does the tripletselectlayer need backward? I keep feeling that without backward, the loss cannot propagate to the earlier layers.
from tripletloss.
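On the backward question above: a selection layer generally does need a backward pass that scatters the top gradient back to the rows it selected, otherwise the gradient is lost before it reaches earlier layers. A minimal sketch of that idea (illustrative only, not the repo's actual tripletselectlayer):

```python
import numpy as np

class SelectBackwardSketch(object):
    """Sketch (assumed class name): a select layer caches which input rows
    it forwarded, and backward scatters the top gradients back to exactly
    those rows; unselected rows receive zero gradient."""

    def forward(self, bottom_data, indices):
        self.indices = indices          # remember which rows were selected
        return bottom_data[indices]

    def backward(self, top_diff, bottom_shape):
        bottom_diff = np.zeros(bottom_shape)  # unselected rows get zero gradient
        bottom_diff[self.indices] = top_diff  # route gradients to selected rows
        return bottom_diff
```

In a Caffe Python layer the same scatter would be written into `bottom[0].diff` inside `backward`.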
When I run it, every layer shows "does not need backward computation". Where did it go wrong?
tripletloss does not need backward computation.
I0504 18:03:40.812458 66083 net.cpp:228] triplet_select does not need backward computation.
I0504 18:03:40.812466 66083 net.cpp:228] norm2 does not need backward computation.
I0504 18:03:40.812474 66083 net.cpp:228] drop8 does not need backward computation.
I0504 18:03:40.812482 66083 net.cpp:228] relu8 does not need backward computation.
I0504 18:03:40.812489 66083 net.cpp:228] fc9 does not need backward computation.
I0504 18:03:40.812497 66083 net.cpp:228] drop7 does not need backward computation.
I0504 18:03:40.812505 66083 net.cpp:228] relu7 does not need backward computation.
I0504 18:03:40.812512 66083 net.cpp:228] fc7 does not need backward computation.
I0504 18:03:40.812520 66083 net.cpp:228] drop6 does not need backward computation.
I0504 18:03:40.812528 66083 net.cpp:228] relu6 does not need backward computation.
I0504 18:03:40.812536 66083 net.cpp:228] fc6 does not need backward computation.
I0504 18:03:40.812544 66083 net.cpp:228] pool5 does not need backward computation.
I0504 18:03:40.812557 66083 net.cpp:228] relu5_3 does not need backward computation.
I0504 18:03:40.812566 66083 net.cpp:228] conv5_3 does not need backward computation.
I0504 18:03:40.812573 66083 net.cpp:228] relu5_2 does not need backward computation.
I0504 18:03:40.812582 66083 net.cpp:228] conv5_2 does not need backward computation.
I0504 18:03:40.812590 66083 net.cpp:228] relu5_1 does not need backward computation.
I0504 18:03:40.812597 66083 net.cpp:228] conv5_1 does not need backward computation.
I0504 18:03:40.812605 66083 net.cpp:228] pool4 does not need backward computation.
I0504 18:03:40.812613 66083 net.cpp:228] relu4_3 does not need backward computation.
I0504 18:03:40.812621 66083 net.cpp:228] conv4_3 does not need backward computation.
I0504 18:03:40.812629 66083 net.cpp:228] relu4_2 does not need backward computation.
I0504 18:03:40.812638 66083 net.cpp:228] conv4_2 does not need backward computation.
I0504 18:03:40.812662 66083 net.cpp:228] relu4_1 does not need backward computation.
I0504 18:03:40.812672 66083 net.cpp:228] conv4_1 does not need backward computation.
I0504 18:03:40.812680 66083 net.cpp:228] pool3 does not need backward computation.
I0504 18:03:40.812688 66083 net.cpp:228] relu3_3 does not need backward computation.
from tripletloss.
@GuitarZhang Hello, I ran into the same problem: AttributeError: 'DataLayer' object has no attribute 'param_str_'. How did you solve it? Could you share the details?
from tripletloss.
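One likely cause of the `param_str_` error above: newer pycaffe exposes the layer parameter string as `param_str`, while older builds used `param_str_`, so repo code written against one name breaks on the other. A version-tolerant sketch (the class name is assumed, and it assumes the parameter string is a Python dict literal; this is not the repo's exact code):

```python
import ast

class DataLayerSketch(object):
    """Sketch of reading the Python-layer parameter string in a
    version-tolerant way: try the newer `param_str` attribute first,
    then fall back to the older `param_str_`."""

    def setup_params(self):
        params_string = getattr(self, "param_str", None)
        if params_string is None:
            params_string = getattr(self, "param_str_", "")
        # Assumes param_str carries a Python dict literal, e.g. "{'batch_size': 30}".
        self.params = ast.literal_eval(params_string) if params_string else {}
```

Alternatively, simply renaming `self.param_str_` to `self.param_str` in the repo's layers matches current Caffe.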
Related Issues (20)
- the module name is right, but it still failed with boost::python::error_already_set
- Late in training the loss all becomes 0.1 and an, ap become 0 HOT 6
- training problem HOT 5
- training problem
- Hard Sample
- the triplet loss architecture is unsupervised, is this correct?
- tipletselectlayer - computing the distance against the anchor image HOT 2
- shuffle
- fc9_1 weights is increase with the train iterations from 0.0x to 40.x HOT 1
- Training your code on custom dataset HOT 1
- how to train it on 2 gpus
- Use of the margin HOT 2
- Negative mining in TripletSelectLayer HOT 1
- what does no_residual_list for?
- Does the dataset organization need any special handling? HOT 1
- online triplet sample selection usage?
- Hello, I want to run this network on my own dataset; besides changing the config paths, what else do I need to modify? HOT 2
- Online triplet generation HOT 1
- Job recommendation
- ap and an both become extremely large