
SPP_net: spatial pyramid pooling in deep convolutional networks for visual recognition

Acknowledgements: a huge thanks to Yangqing Jia and the BVLC team for creating Caffe, and to Ross Girshick for creating R-CNN.

Introduction

This is a re-implementation of the object detection algorithm described in the ECCV 2014 paper "Spatial Pyramid Pooling in Deep Convolutional Networks for Visual Recognition". It should reproduce the object detection results reported in the paper up to some statistical variance. The models used in the paper were trained/fine-tuned using cuda-convnet, while the model attached to this code was trained/fine-tuned using Caffe, for ease of code release.

The image classification training/testing code is not included, but the network configuration files are available directly in this repository.

Please contact [email protected] or [email protected] if you have any questions.

Citing SPP_net

If you find SPP_net useful in your research, please consider citing:

@inproceedings{kaiming14ECCV,
    Author = {He, Kaiming and Zhang, Xiangyu and Ren, Shaoqing and Sun, Jian},
    Title = {Spatial pyramid pooling in deep convolutional networks for visual recognition},
    Booktitle = {European Conference on Computer Vision},
    Year = {2014}
}

License

SPP_net is released under the Simplified BSD License for non-commercial use (refer to the LICENSE file for details).

Installing SPP_net

Prerequisites

  1. MATLAB (tested with 2014a on 64-bit Windows)
  2. Caffe's prerequisites. Some functions are based on our modified Caffe, so we provide a compiled caffe mex and the cpp file for the mex wrapper; run external\fetch_caffe_mex_5_5.m to download them.

News: a caffe version which supports the SPP mex is provided at https://github.com/ShaoqingRen/caffe/tree/SPP_net; this version was forked from BVLC/caffe on Oct. 1, 2014. For the caffe.mex compiled from this caffe code, use the Zeiler CNN network with the compatible structure (shared on OneDrive) and the new prototxts for fine-tuning (in ./model-defs).

Install SPP_net

  1. Get the SPP_net source code by cloning the repository: git clone https://github.com/ShaoqingRen/SPP_net.git
  2. Change into the SPP_net source code directory.
  3. SPP_net expects to find Caffe in external/caffe.
  4. Start MATLAB (make sure you're still in the SPP_net directory): matlab
  5. You'll be prompted to download the Selective Search code, which we cannot redistribute. Afterwards, you should see the message SPP_net startup done followed by the MATLAB prompt >>.
  6. Run the build script: >> spp_build() (this builds liblinear, Selective Search, spp_pool and nms). Don't worry if you see compiler warnings while building liblinear; this is normal.
  7. Download the model package by running external\fetch_model_data.m.

Training your own SPP_net detector on PASCAL VOC

Let's use PASCAL VOC 2007 as an example. The basic pipeline is:

extract features to disk -> finetune -> train SVMs -> test

You'll need about 20GB of free disk space for the feature cache (which is stored in feat_cache by default). It's best if the feature cache is on a fast, local disk.

A one-click script is available at experiments\Script_spp_voc.m.

spp_net's People

Contributors

eendebakpt, shaoqingren


spp_net's Issues

spp_build() compile error

Hi, I am a researcher who needs SPP_net to make our detection fast.
But when I compile, I get the following error messages:
Compiling spm_pool_caffe_mex
utils/spm_pool/spm_pool_caffe_mex.cpp: In function ‘void mexFunction(int, mxArray**, int, const mxArray**)’:
utils/spm_pool/spm_pool_caffe_mex.cpp:101: error: ‘floor’ was not declared in this scope
utils/spm_pool/spm_pool_caffe_mex.cpp:104: error: ‘ceil’ was not declared in this scope
utils/spm_pool/spm_pool_caffe_mex.cpp:132: error: ‘memset’ was not declared in this scope
utils/spm_pool/spm_pool_caffe_mex.cpp:134: error: ‘_aligned_malloc’ was not declared in this scope
utils/spm_pool/spm_pool_caffe_mex.cpp:168: error: ‘floor’ was not declared in this scope
utils/spm_pool/spm_pool_caffe_mex.cpp:169: error: ‘ceil’ was not declared in this scope
utils/spm_pool/spm_pool_caffe_mex.cpp:184: error: expected initializer before ‘*’ token
utils/spm_pool/spm_pool_caffe_mex.cpp:185: error: ‘__m128’ was not declared in this scope
utils/spm_pool/spm_pool_caffe_mex.cpp:185: error: ‘pooled_this_div_sse’ was not declared in this scope
utils/spm_pool/spm_pool_caffe_mex.cpp:185: error: expected primary-expression before ‘)’ token
utils/spm_pool/spm_pool_caffe_mex.cpp:185: error: expected ‘;’ before ‘pooled_this_div_cache’
utils/spm_pool/spm_pool_caffe_mex.cpp:189: error: ‘feats_this_sse’ was not declared in this scope
utils/spm_pool/spm_pool_caffe_mex.cpp:189: error: ‘_mm_max_ps’ was not declared in this scope
utils/spm_pool/spm_pool_caffe_mex.cpp:218: error: ‘_aligned_free’ was not declared in this scope

mex: compile of ' "utils/spm_pool/spm_pool_caffe_mex.cpp"' failed.

Unable to complete successfully.

I have cleared the first three error messages by including <math.h> and <string.h>.
I think the others are SSE instruction set compile errors.
I am compiling in MATLAB R2013b; is that the problem?

Caltech results

Hi,

I was able to run the PASCAL detection code of SPP_net, and it worked great! Thanks :)

I wanted to try classification on the Caltech101 dataset; do you by any chance still have the code for that? (I couldn't find it in this repo.)

new net in matcaffe failed

I found that when I use image_data_layer as input, caffe fails to read the protofile. And in matcaffe, new net(string(param_file)) fails too.

Script_spp_voc.m errors due to prototxt version

Hi, Shaoqing

I downloaded the SPP package and the corresponding caffe package a few days ago. Now I have some problems running Script_spp_voc.m.

If I use the original pascal_finetune_fc_spm_solver.prototxt, MATLAB crashes with an error message of

WARNING: Logging before InitGoogleLogging() is written to STDERR
I1123 15:46:26.743105 9641 common.cpp:222] Compute Capability 3.5, set cuda_num_threads = 1024
libprotobuf ERROR google/protobuf/text_format.cc:172] Error parsing text-format caffe.NetParameter: 17:23: Message type "caffe.V0LayerParameter" has no field named "inner_product_param".
F1123 15:46:30.807468 9641 upgrade_proto.cpp:627] Check failed: ReadProtoFromTextFile(param_file, param) Failed to parse NetParameter file: pascal_finetune_fc_spm_train.prototxt

If I change to the other prototxt, pascal_finetune_fc_spm_solver_new.prototxt, MATLAB also crashes, whether or not I change the prototxt following the 5th post.

So which one is the right version for training on VOC with the current caffe version?
Could you help solve this?

Thanks.

Request for Complete Zeiler Model

Hi, Shaoqing

May I have your complete Zeiler model for evaluating the performance using caffe? Your current Zeiler_conv5 seems to have only the parameters from conv1 to conv5, so it cannot be directly used in caffe.

Thanks a lot.

How can I use SPP_net for classification

Hi,
According to the SPP_net paper, it performs well on image classification. How can I train the model and test it for the classification task on VOC2007 and Caltech101?
Thank you!

Is the feat in spp_detect.m the final feature used in fully connected layer?

As titled.
In spp_detect.m, lines 40 to 43:

feat = spp_features_convX(im, spm_im_size, [], use_gpu);
feat = spp_features_convX_to_poolX(spp_model.spp_pooler, feat, boxes, false);
feat = spp_poolX_to_fcX(feat, spp_model.training_opts.layer, spp_model, use_gpu);
feat = spp_scale_features(feat, spp_model.training_opts.feat_norm_mean);

Is this feat the final feature used in the fully connected layer?
In other words, can I use feat as an extracted feature in other applications?

Out of memory

Hi,
My computer has 8GB of memory. When I trained the detector on the VOC dataset, MATLAB threw an "out of memory" error. I added 4GB of memory, but the error persists.
How much memory is needed? 16GB or more?

It takes too long to finetune

Hello @ShaoqingRen, I started the one-click script experiments\Script_spp_voc.m 7 days ago, after downloading the needed dataset. But until now, MATLAB is still running at the fine-tuning stage... I have an NVIDIA GTX 770 in my computer.
Could you tell me how long it takes to finish running experiments\Script_spp_voc.m on your computer, and what your computer's hardware is?

spp_model.mat not exist !

Hello, Dr. Kaiming He:
When I run the spp_demo.m file, I encounter the following problem.

Error using spp_demo(line 5)
.\data\spp_model\VOC2007\spp_model.mat not exist !

The .\data directory exists, but the sub-directory \spp_model\VOC2007\spp_model.mat does not. How can I get the file spp_model.mat?

Could you help me? Thank you very much.

compile error with custom matcaffe.cpp

In the customized matcaffe.cpp there are two includes that cannot be found on my system:

#include "caffe/util/Directory.h"

#include "caffe/util/Path.h"

I cannot find Directory.h anywhere in the caffe codebase. Are these files from SPP? If so, can they be added to the SPP code, or (even better) can the customized matcaffe.cpp be included in caffe itself?

box regression error

Dear Shaoqing,
I ran SPP on Linux, and most of script_spp_voc ran smoothly.
However, I get the following error in box regression when I set the utilized layer to 6 or 7:
feature stats: 1/200
Cell contents reference from a non-cell array object.

Error in spp_poolX_to_fcX (line 39)
feat_gpu = max(0, bsxfun(@plus,
spp_model.cnn.layers(i).weights_gpu{1} * feat_gpu, ...

Error in spp_feature_stats (line 58)
X = spp_poolX_to_fcX(X, layer, spp_model, conf.use_gpu);

Error in spp_train_bbox_regressor (line 60)
opts.feat_norm_mean = spp_feature_stats(imdb, roidb, opts.layer,
spp_model);

Error in spp_exp_bbox_reg_train_and_test_voc (line 37)
bbox_reg = spp_train_bbox_regressor(opts.imdb_train,
opts.roidb_train, ld.spp_model, ...

Error in Script_spp_voc (line 83)
spp_exp_bbox_reg_train_and_test_voc(opts);

Thanks for your help.
Regards
Jackie

Error using caffe: API command not recognized

Hi,

When I run spp_demo from MATLAB, I get the following error. I hope someone can help solve it.
P.S.: I have already installed and tested caffe successfully.

spp_demo
Computing candidate regions...found 2034 candidates (in 1.535s).
Extracting CNN features from regions...Error using caffe
API command not recognized

Error in spp_features_convX (line 61)
caffe('set_gpu_available');

Error in spp_detect (line 40)
feat = spp_features_convX(im, spm_im_size, [], use_gpu);

Error in spp_demo (line 40)
dets = spp_detect(im, spp_model, spm_im_size, use_gpu);

Thank you very much,

matcaffe in Linux

Hi, Shaoqing

I have tried your code on Linux Ubuntu 12.04. Since your provided matcaffe is for Windows, I downloaded the caffe package from https://github.com/ShaoqingRen/caffe/archive/master.zip

After building all the SPP components, caffe, and matcaffe, I tried to run spp_demo, but I encountered the following problem.

libprotobuf ERROR google/protobuf/text_format.cc:172] Error parsing text-format caffe.NetParameter: 14:14: Message type "caffe.V0LayerParameter" has no field named "conv_param".
WARNING: Logging before InitGoogleLogging() is written to STDERR
F1121 20:39:26.828352 28307 upgrade_proto.cpp:627] Check failed: ReadProtoFromTextFile(param_file, param) Failed to parse NetParameter file: /media/Backup/SPP_net-master/data/cnn_model/Zeiler_conv5/Zeiler_spm_scale224_test_conv5.prototxt
*** Check failure stack trace: ***
Killed

It seems the provided caffe is not compatible with your prototxt?

Could you kindly help solve this problem?

Thanks.

spp_demo crashes at second time

Hi @ShaoqingRen

I have some discussions elsewhere, but I post here separately.

When I run spp_demo for the first time, it works well. However, when I run spp_demo a second time, MATLAB crashes.

I logged some printed lines and noticed that g = gpuDevice(1); fails, so I guess the GPU device is still not released after the first run.

Hope you can help to debug this error.

Thank you,

Can't open Matlab inside SPP_net

When I run matlab in the SPP_net root directory, it prints Opening log file: /Users/rose/java.log.3323 and just hangs for >10 min. How can I debug this problem? MATLAB opens just fine when I run matlab from other directories.

How to run spp_net with caffe (compiled from source) on Ubuntu

Hi Shaoqing,

I intend to compile and run spp_net on Ubuntu (because I'm running rcnn on Ubuntu too).

To do this, I need to compile caffe from source, rather than use the caffe.mexw64 that you provided for Windows.

But when I compile caffe (https://github.com/ShaoqingRen/caffe), files are missing, such as "caffe/util/Directory.h" and probably others (because you have already modified it).

Please give me some suggestions so that I can run spp_net on Ubuntu.

Thank you very much,

crash problem

Hi Shaoqing,
when I run SPP_net/experiments/Script_spp_voc.m, it can extract features from images.
However, it crashes after running on a number of images. This happens occasionally.
Have you met this problem before?
The error information follows, although I found it of little help.
Regards

Jackie

          abort() detected at Sat Nov 29 16:18:49 2014

Configuration:
Crash Decoding : Disabled
Current Visual : 0x24 (class 4, depth 24)
Default Encoding : UTF-8
GNU C Library : 2.15 stable
MATLAB Architecture: glnxa64
MATLAB Root : /usr/local/MATLAB/R2013b
MATLAB Version : 8.2.0.701 (R2013b)
Operating System : Linux 3.8.0-38-generic #56~precise1-Ubuntu SMP Thu Mar 13 16:22:48 UTC 2014 x86_64
Processor ID : x86 Family 6 Model 45 Stepping 7, GenuineIntel
Virtual Machine : Java 1.7.0_11-b21 with Oracle Corporation Java HotSpot(TM) 64-Bit Server VM mixed mode
Window System : The XFree86 Project, Inc (40300000), display :12.0

Fault Count: 1

Abnormal termination:
abort()

This error was detected while a MEX-file was running. If the MEX-file
is not an official MathWorks function, please examine its source code
for errors. Please consult the External Interfaces Guide for information
on debugging MEX-files.

Error in imdb_from_voc

I get the following error when running Script_spp_voc (the error is in imdb_from_voc):

Error using fliplr (line 18)
X must be a 2-D matrix.

Error in imdb_from_voc (line 55)
             imwrite(fliplr(im), flip_image_at(i));

Error in Script_spp_voc>perpare_train_data (line 89)
    opts.imdb_train             = {  imdb_from_voc(opts.devkit, 'trainval', '2007', flip) };

Error in Script_spp_voc (line 46)
opts                        = perpare_train_data(opts, opts.flip | opts.flip_finetune);

When I check the data, the image is a 3D matrix.

OS platform

Hi Shaoqing,
Does SPP_net support Linux? When I build the prerequisite caffe, I get the following make error:
./include/caffe/syncedmem.hpp: In constructor ‘caffe::SyncedMemory::SyncedMemory(size_t)’:
./include/caffe/syncedmem.hpp:67:10: warning: ‘caffe::SyncedMemory::size_’ will be initialized after [-Wreorder]
./include/caffe/syncedmem.hpp:65:10: warning: ‘size_t caffe::SyncedMemory::cpu_capacity_’ [-Wreorder]
./include/caffe/syncedmem.hpp:43:12: warning: when initialized here [-Wreorder]
python/caffe/_caffe.cpp: In member function ‘void caffe::PyNet::set_input_arrays(boost::python::api::object, boost::python::api::object)’:
python/caffe/_caffe.cpp:96:14: error: ‘MemoryDataLayer’ was not declared in this scope
python/caffe/_caffe.cpp:96:35: error: template argument 1 is invalid
python/caffe/_caffe.cpp:96:37: error: expected unqualified-id before ‘>’ token
python/caffe/_caffe.cpp:98:8: error: ‘md_layer’ was not declared in this scope
python/caffe/_caffe.cpp:108:50: error: ‘md_layer’ was not declared in this scope
make: *** [python/caffe/_caffe.so] Error 1

compile issue

I was able to compile the newest version of caffe (https://github.com/BVLC/caffe). However, to use SPP_net, I am trying to compile caffe (https://github.com/ShaoqingRen/caffe) on Ubuntu 14.04, but I get the following error. Please help:

src/caffe/test/test_data_layer.cpp: In member function ‘void caffe::DataLayerTest::FillLMDB(bool)’:
src/caffe/test/test_data_layer.cpp:73:5: error: ‘MDB_env’ was not declared in this scope
MDB_env env;
^
src/caffe/test/test_data_layer.cpp:73:14: error: ‘env’ was not declared in this scope
MDB_env *env;
^
src/caffe/test/test_data_layer.cpp:74:5: error: ‘MDB_dbi’ was not declared in this scope
MDB_dbi dbi;
^
src/caffe/test/test_data_layer.cpp:74:13: error: expected ‘;’ before ‘dbi’
MDB_dbi dbi;
^
src/caffe/test/test_data_layer.cpp:75:5: error: ‘MDB_val’ was not declared in this scope
MDB_val mdbkey, mdbdata;
^
src/caffe/test/test_data_layer.cpp:75:13: error: expected ‘;’ before ‘mdbkey’
MDB_val mdbkey, mdbdata;
^
src/caffe/test/test_data_layer.cpp:76:5: error: ‘MDB_txn’ was not declared in this scope
MDB_txn *txn;
^
src/caffe/test/test_data_layer.cpp:76:14: error: ‘txn’ was not declared in this scope
MDB_txn *txn;
^
In file included from ./include/caffe/common.hpp:6:0,
from ./include/caffe/blob.hpp:4,
from src/caffe/test/test_data_layer.cpp:7:
src/caffe/test/test_data_layer.cpp:77:33: error: there are no arguments to ‘mdb_env_create’ that depend on a template parameter, so a declaration of ‘mdb_env_create’ must be available [-fpermissive]
CHECK_EQ(mdb_env_create(&env), MDB_SUCCESS) << "mdb_env_create failed";
^
src/caffe/test/test_data_layer.cpp:77:33: note: (if you use ‘-fpermissive’, G++ will accept your code, but allowing the use of an undeclared name is deprecated)
src/caffe/test/test_data_layer.cpp:77:36: error: ‘MDB_SUCCESS’ was not declared in this scope
CHECK_EQ(mdb_env_create(&env), MDB_SUCCESS) << "mdb_env_create failed";
^
src/caffe/test/test_data_layer.cpp:78:52: error: there are no arguments to ‘mdb_env_set_mapsize’ that depend on a template parameter, so a declaration of ‘mdb_env_set_mapsize’ must be available [-fpermissive]
CHECK_EQ(mdb_env_set_mapsize(env, 1099511627776), MDB_SUCCESS) // 1TB
^
src/caffe/test/test_data_layer.cpp:78:55: error: ‘MDB_SUCCESS’ was not declared in this scope
CHECK_EQ(mdb_env_set_mapsize(env, 1099511627776), MDB_SUCCESS) // 1TB
^
src/caffe/test/test_data_layer.cpp:80:59: error: there are no arguments to ‘mdb_env_open’ that depend on a template parameter, so a declaration of ‘mdb_env_open’ must be available [-fpermissive]
CHECK_EQ(mdb_env_open(env, filename
->c_str(), 0, 0664), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:80:62: error: ‘MDB_SUCCESS’ was not declared in this scope
CHECK_EQ(mdb_env_open(env, filename_->c_str(), 0, 0664), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:82:46: error: there are no arguments to ‘mdb_txn_begin’ that depend on a template parameter, so a declaration of ‘mdb_txn_begin’ must be available [-fpermissive]
CHECK_EQ(mdb_txn_begin(env, NULL, 0, &txn), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:82:49: error: ‘MDB_SUCCESS’ was not declared in this scope
CHECK_EQ(mdb_txn_begin(env, NULL, 0, &txn), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:84:38: error: ‘dbi’ was not declared in this scope
CHECK_EQ(mdb_open(txn, NULL, 0, &dbi), MDB_SUCCESS) << "mdb_open failed";
^
src/caffe/test/test_data_layer.cpp:84:41: error: there are no arguments to ‘mdb_open’ that depend on a template parameter, so a declaration of ‘mdb_open’ must be available [-fpermissive]
CHECK_EQ(mdb_open(txn, NULL, 0, &dbi), MDB_SUCCESS) << "mdb_open failed";
^
src/caffe/test/test_data_layer.cpp:84:44: error: ‘MDB_SUCCESS’ was not declared in this scope
CHECK_EQ(mdb_open(txn, NULL, 0, &dbi), MDB_SUCCESS) << "mdb_open failed";
^
src/caffe/test/test_data_layer.cpp:102:7: error: ‘mdbdata’ was not declared in this scope
mdbdata.mv_size = value.size();
^
src/caffe/test/test_data_layer.cpp:105:7: error: ‘mdbkey’ was not declared in this scope
mdbkey.mv_size = keystr.size();
^
In file included from ./include/caffe/common.hpp:6:0,
from ./include/caffe/blob.hpp:4,
from src/caffe/test/test_data_layer.cpp:7:
src/caffe/test/test_data_layer.cpp:107:29: error: ‘dbi’ was not declared in this scope
CHECK_EQ(mdb_put(txn, dbi, &mdbkey, &mdbdata, 0), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:107:54: error: there are no arguments to ‘mdb_put’ that depend on a template parameter, so a declaration of ‘mdb_put’ must be available [-fpermissive]
CHECK_EQ(mdb_put(txn, dbi, &mdbkey, &mdbdata, 0), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:107:57: error: ‘MDB_SUCCESS’ was not declared in this scope
CHECK_EQ(mdb_put(txn, dbi, &mdbkey, &mdbdata, 0), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:110:32: error: there are no arguments to ‘mdb_txn_commit’ that depend on a template parameter, so a declaration of ‘mdb_txn_commit’ must be available [-fpermissive]
CHECK_EQ(mdb_txn_commit(txn), MDB_SUCCESS) << "mdb_txn_commit failed";
^
src/caffe/test/test_data_layer.cpp:110:35: error: ‘MDB_SUCCESS’ was not declared in this scope
CHECK_EQ(mdb_txn_commit(txn), MDB_SUCCESS) << "mdb_txn_commit failed";
^
src/caffe/test/test_data_layer.cpp:111:20: error: ‘dbi’ was not declared in this scope
mdb_close(env, dbi);
^
src/caffe/test/test_data_layer.cpp:111:23: error: there are no arguments to ‘mdb_close’ that depend on a template parameter, so a declaration of ‘mdb_close’ must be available [-fpermissive]
mdb_close(env, dbi);
^
src/caffe/test/test_data_layer.cpp:112:22: error: there are no arguments to ‘mdb_env_close’ that depend on a template parameter, so a declaration of ‘mdb_env_close’ must be available [-fpermissive]
mdb_env_close(env);
^
In file included from ./include/caffe/common.hpp:6:0,
from ./include/caffe/blob.hpp:4,
from src/caffe/test/test_data_layer.cpp:7:
src/caffe/test/test_data_layer.cpp: In instantiation of ‘void caffe::DataLayerTest::FillLMDB(bool) [with TypeParam = caffe::DoubleGPU]’:
src/caffe/test/test_data_layer.cpp:400:3: required from ‘void caffe::DataLayerTest_TestReadCropTestLMDB_Test<gtest_TypeParam_>::TestBody() [with gtest_TypeParam_ = caffe::DoubleGPU]’
src/caffe/test/test_data_layer.cpp:404:1: required from here
src/caffe/test/test_data_layer.cpp:77:33: error: ‘mdb_env_create’ was not declared in this scope
CHECK_EQ(mdb_env_create(&env), MDB_SUCCESS) << "mdb_env_create failed";
^
src/caffe/test/test_data_layer.cpp:78:52: error: ‘mdb_env_set_mapsize’ was not declared in this scope
CHECK_EQ(mdb_env_set_mapsize(env, 1099511627776), MDB_SUCCESS) // 1TB
^
src/caffe/test/test_data_layer.cpp:80:59: error: ‘mdb_env_open’ was not declared in this scope
CHECK_EQ(mdb_env_open(env, filename_->c_str(), 0, 0664), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:82:46: error: ‘mdb_txn_begin’ was not declared in this scope
CHECK_EQ(mdb_txn_begin(env, NULL, 0, &txn), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:84:41: error: ‘mdb_open’ was not declared in this scope
CHECK_EQ(mdb_open(txn, NULL, 0, &dbi), MDB_SUCCESS) << "mdb_open failed";
^
src/caffe/test/test_data_layer.cpp:107:54: error: ‘mdb_put’ was not declared in this scope
CHECK_EQ(mdb_put(txn, dbi, &mdbkey, &mdbdata, 0), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:110:32: error: ‘mdb_txn_commit’ was not declared in this scope
CHECK_EQ(mdb_txn_commit(txn), MDB_SUCCESS) << "mdb_txn_commit failed";
^
src/caffe/test/test_data_layer.cpp:111:23: error: ‘mdb_close’ was not declared in this scope
mdb_close(env, dbi);
^
src/caffe/test/test_data_layer.cpp:112:22: error: ‘mdb_env_close’ was not declared in this scope
mdb_env_close(env);
^
In file included from ./include/caffe/common.hpp:6:0,
from ./include/caffe/blob.hpp:4,
from src/caffe/test/test_data_layer.cpp:7:
src/caffe/test/test_data_layer.cpp: In instantiation of ‘void caffe::DataLayerTest::FillLMDB(bool) [with TypeParam = caffe::FloatGPU]’:
src/caffe/test/test_data_layer.cpp:400:3: required from ‘void caffe::DataLayerTest_TestReadCropTestLMDB_Test<gtest_TypeParam_>::TestBody() [with gtest_TypeParam_ = caffe::FloatGPU]’
src/caffe/test/test_data_layer.cpp:404:1: required from here
src/caffe/test/test_data_layer.cpp:77:33: error: ‘mdb_env_create’ was not declared in this scope
CHECK_EQ(mdb_env_create(&env), MDB_SUCCESS) << "mdb_env_create failed";
^
src/caffe/test/test_data_layer.cpp:78:52: error: ‘mdb_env_set_mapsize’ was not declared in this scope
CHECK_EQ(mdb_env_set_mapsize(env, 1099511627776), MDB_SUCCESS) // 1TB
^
src/caffe/test/test_data_layer.cpp:80:59: error: ‘mdb_env_open’ was not declared in this scope
CHECK_EQ(mdb_env_open(env, filename_->c_str(), 0, 0664), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:82:46: error: ‘mdb_txn_begin’ was not declared in this scope
CHECK_EQ(mdb_txn_begin(env, NULL, 0, &txn), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:84:41: error: ‘mdb_open’ was not declared in this scope
CHECK_EQ(mdb_open(txn, NULL, 0, &dbi), MDB_SUCCESS) << "mdb_open failed";
^
src/caffe/test/test_data_layer.cpp:107:54: error: ‘mdb_put’ was not declared in this scope
CHECK_EQ(mdb_put(txn, dbi, &mdbkey, &mdbdata, 0), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:110:32: error: ‘mdb_txn_commit’ was not declared in this scope
CHECK_EQ(mdb_txn_commit(txn), MDB_SUCCESS) << "mdb_txn_commit failed";
^
src/caffe/test/test_data_layer.cpp:111:23: error: ‘mdb_close’ was not declared in this scope
mdb_close(env, dbi);
^
src/caffe/test/test_data_layer.cpp:112:22: error: ‘mdb_env_close’ was not declared in this scope
mdb_env_close(env);
^
In file included from ./include/caffe/common.hpp:6:0,
from ./include/caffe/blob.hpp:4,
from src/caffe/test/test_data_layer.cpp:7:
src/caffe/test/test_data_layer.cpp: In instantiation of ‘void caffe::DataLayerTest::FillLMDB(bool) [with TypeParam = caffe::DoubleCPU]’:
src/caffe/test/test_data_layer.cpp:400:3: required from ‘void caffe::DataLayerTest_TestReadCropTestLMDB_Test<gtest_TypeParam_>::TestBody() [with gtest_TypeParam_ = caffe::DoubleCPU]’
src/caffe/test/test_data_layer.cpp:404:1: required from here
src/caffe/test/test_data_layer.cpp:77:33: error: ‘mdb_env_create’ was not declared in this scope
CHECK_EQ(mdb_env_create(&env), MDB_SUCCESS) << "mdb_env_create failed";
^
src/caffe/test/test_data_layer.cpp:78:52: error: ‘mdb_env_set_mapsize’ was not declared in this scope
CHECK_EQ(mdb_env_set_mapsize(env, 1099511627776), MDB_SUCCESS) // 1TB
^
src/caffe/test/test_data_layer.cpp:80:59: error: ‘mdb_env_open’ was not declared in this scope
CHECK_EQ(mdb_env_open(env, filename_->c_str(), 0, 0664), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:82:46: error: ‘mdb_txn_begin’ was not declared in this scope
CHECK_EQ(mdb_txn_begin(env, NULL, 0, &txn), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:84:41: error: ‘mdb_open’ was not declared in this scope
CHECK_EQ(mdb_open(txn, NULL, 0, &dbi), MDB_SUCCESS) << "mdb_open failed";
^
src/caffe/test/test_data_layer.cpp:107:54: error: ‘mdb_put’ was not declared in this scope
CHECK_EQ(mdb_put(txn, dbi, &mdbkey, &mdbdata, 0), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:110:32: error: ‘mdb_txn_commit’ was not declared in this scope
CHECK_EQ(mdb_txn_commit(txn), MDB_SUCCESS) << "mdb_txn_commit failed";
^
src/caffe/test/test_data_layer.cpp:111:23: error: ‘mdb_close’ was not declared in this scope
mdb_close(env, dbi);
^
src/caffe/test/test_data_layer.cpp:112:22: error: ‘mdb_env_close’ was not declared in this scope
mdb_env_close(env);
^
In file included from ./include/caffe/common.hpp:6:0,
from ./include/caffe/blob.hpp:4,
from src/caffe/test/test_data_layer.cpp:7:
src/caffe/test/test_data_layer.cpp: In instantiation of ‘void caffe::DataLayerTest::FillLMDB(bool) [with TypeParam = caffe::FloatCPU]’:
src/caffe/test/test_data_layer.cpp:400:3: required from ‘void caffe::DataLayerTest_TestReadCropTestLMDB_Test<gtest_TypeParam_>::TestBody() [with gtest_TypeParam_ = caffe::FloatCPU]’
src/caffe/test/test_data_layer.cpp:404:1: required from here
src/caffe/test/test_data_layer.cpp:77:33: error: ‘mdb_env_create’ was not declared in this scope
CHECK_EQ(mdb_env_create(&env), MDB_SUCCESS) << "mdb_env_create failed";
^
src/caffe/test/test_data_layer.cpp:78:52: error: ‘mdb_env_set_mapsize’ was not declared in this scope
CHECK_EQ(mdb_env_set_mapsize(env, 1099511627776), MDB_SUCCESS) // 1TB
^
src/caffe/test/test_data_layer.cpp:80:59: error: ‘mdb_env_open’ was not declared in this scope
CHECK_EQ(mdb_env_open(env, filename_->c_str(), 0, 0664), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:82:46: error: ‘mdb_txn_begin’ was not declared in this scope
CHECK_EQ(mdb_txn_begin(env, NULL, 0, &txn), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:84:41: error: ‘mdb_open’ was not declared in this scope
CHECK_EQ(mdb_open(txn, NULL, 0, &dbi), MDB_SUCCESS) << "mdb_open failed";
^
src/caffe/test/test_data_layer.cpp:107:54: error: ‘mdb_put’ was not declared in this scope
CHECK_EQ(mdb_put(txn, dbi, &mdbkey, &mdbdata, 0), MDB_SUCCESS)
^
src/caffe/test/test_data_layer.cpp:110:32: error: ‘mdb_txn_commit’ was not declared in this scope
CHECK_EQ(mdb_txn_commit(txn), MDB_SUCCESS) << "mdb_txn_commit failed";
^
src/caffe/test/test_data_layer.cpp:111:23: error: ‘mdb_close’ was not declared in this scope
mdb_close(env, dbi);
^
src/caffe/test/test_data_layer.cpp:112:22: error: ‘mdb_env_close’ was not declared in this scope
mdb_env_close(env);
^
make: *** [.build_release/src/caffe/test/test_data_layer.o] Error 1

Errors running spp_demo

I am using the caffe version you provided : https://github.com/ShaoqingRen/caffe

I tried running spp_demo with use_gpu = false.
I first got an error on caffe('set_gpu_forbid'); saying caffe did not recognize it, so I commented it out (as there was a line above it saying caffe('set_mode_cpu');). This was in the file spp_features_convX.

Then it passed that point, and it now gives an error on caffe('set_input_size', varargin{:}); in caffe_anysize_test, saying caffe doesn't know that command.

Again: I am using your caffe, and I have compiled it without cuDNN but with CUDA.

Any idea on how I could approach debugging this?

I have two questions about finetune on voc2007

@ShaoqingRen ,
First, I have read the SPP_net code, and I have a question about the fine-tuning. Why is the fine-tuning done on the trainval and test sets, rather than on the train and val sets? I think the test set should be used only once.
Second, I have run spp_demo and everything is OK. These are my parameters:
caffe_net_file = fullfile(pwd, 'data/Zeiler_conv5_new/Zeiler_conv5');
caffe_net_def_file = fullfile(pwd, 'data/Zeiler_conv5_new/Zeiler_spm_scale224_test_conv5.prototxt');

When I run Script_spp_voc.m, something goes wrong, although I have changed nothing except the net parameters.
These are my net parameters:
opts.net_file = fullfile(pwd, 'data/Zeiler_conv5_new/Zeiler_conv5');
opts.net_def_file = fullfile(pwd,'data/Zeiler_conv5_new/Zeiler_spm_scale224_test_conv5.prototxt');
opts.spp_params_def = fullfile(pwd, 'data/Zeiler_conv5_new/spp_config');
opts.finetune_net_def_file = fullfile(pwd, 'model-defs/pascal_finetune_fc_spm_solver_new.prototxt');
Everything is OK until ''rst = caffe('train', {data_train{1}, label_train{1}})''. MATLAB closes, but there is no error information. So I did some tests: I put "rst = caffe('test', {data_test{1}, label_test{1}})" before rst = caffe('train', {data_train{1}, label_train{1}}). The caffe test call runs, but the train call still crashes.
Thanks!

Compiling liblinear via Matlab on mac

liblinear depends on OpenMP, which gcc4.9 supports. I tried to make sure Matlab uses gcc4.9, and some evidence indicates it does, and some that it doesn't. This is the error I find:

external/liblinear-1.93_multicore/matlab/train.cpp:7:10: fatal error: 'omp.h' file not found
#include <omp.h>
         ^
1 error generated.

    mex: compile of ' "external/liblinear-1.93_multicore/matlab/train.cpp"' failed.

I made sure to change the /openmp option to -fopenmp because I'm on mac.

I asked this question on StackOverflow: http://stackoverflow.com/questions/26724667/openmp-not-available-on-mac-with-gcc-4-9.
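In case it helps others who hit this: `omp.h` ships with GCC, not with Apple's default clang toolchain, so mex has to actually invoke GCC and pass `-fopenmp`. A rough sketch of how to verify which compiler sees the header (the compiler name and mex overrides below are illustrative; the exact mexopts setup varies by MATLAB version):

```shell
# Hedged sketch: check whether a given compiler can find omp.h.
# Apple clang (the default `cc` on macOS) has no omp.h; a real GCC does.
CC=${CC:-cc}   # substitute g++-4.9 (or whatever mex is configured to use)
if echo '#include <omp.h>' | "$CC" -E -x c - >/dev/null 2>&1; then
  echo "omp.h visible to $CC"
else
  echo "omp.h NOT visible to $CC"
fi
# If the probe fails for mex's compiler, point mex at GCC explicitly, e.g.
# (illustrative invocation):
#   mex CXX=g++-4.9 CXXFLAGS='$CXXFLAGS -fopenmp' LDFLAGS='$LDFLAGS -fopenmp' \
#       external/liblinear-1.93_multicore/matlab/train.cpp
```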

spp_demo crashed

I ran "spp_demo" under following conditions:
GTX760 2G memory
GPU mode
"varargin" of "caffe_anysize_test.m" is 1800_1200_3
After executing "[result] = caffe('forward', varargin);", matlab crashed. Does this problem due to insufficient GPU memory (2G)?

How to run SPP on Debian?

Hi,
How can I resolve the errors "there is no file windows.h" and "there is no file tchar.h" when compiling spp_build on Debian?
It also can't find some ".h" files when compiling the caffe downloaded by fetch_caffe_mex_5_5.m in MATLAB.

Thank you!

Script_spp_voc

Hi,
I ran Script_spp_voc and got some results. I assume the results are AP (judging from the variable names in the scripts), but they are very low. Am I wrong, and are they actually error percentages?
[screenshot of results, 2014-10-19]

When I run spp_exp_train_and_test_voc(opts), there is an error

When I run spp_exp_train_and_test_voc(opts), there is an error:

eval_voc (line 82)
Too many input arguments.

Error in spp_test>(parfor body) (line 171)
res(model_ind) = imdb.eval_func(cls, aboxes{model_ind}, imdb, cache_name, suffix, fast);

Error in spp_test (line 169)
parfor model_ind = 1:num_classes

Lately, I found the problem comes from this line in the spp_exp_train_and_test_voc.m file: res_test = spp_test(spp_model, opts.imdb_test, opts.roidb_test, opts.feat_cache_test, '', true). Can anyone help me?

Is it possible to run SPP on ubuntu?

Hi everyone

I want to use SPP on Ubuntu 12.04.
However, when I execute spp_build(), an error message appears:

Error using mex
/tmp/mex_2071126706256729_16132/tron.o: In function `TRON::trcg(double, double*, double*, double*)':
tron.cpp:(.text+0x1da): undefined reference to `dnrm2_'
tron.cpp:(.text+0x1fd): undefined reference to `ddot_'
tron.cpp:(.text+0x219): undefined reference to `dnrm2_'
tron.cpp:(.text+0x252): undefined reference to `ddot_'
tron.cpp:(.text+0x27a): undefined reference to `daxpy_'
tron.cpp:(.text+0x28a): undefined reference to `dnrm2_'
tron.cpp:(.text+0x2d9): undefined reference to `daxpy_'
tron.cpp:(.text+0x2ef): undefined reference to `ddot_'
tron.cpp:(.text+0x30a): undefined reference to `ddot_'
tron.cpp:(.text+0x325): undefined reference to `ddot_'
tron.cpp:(.text+0x3c6): undefined reference to `daxpy_'
tron.cpp:(.text+0x3f6): undefined reference to `daxpy_'
tron.cpp:(.text+0x427): undefined reference to `daxpy_'
tron.cpp:(.text+0x43d): undefined reference to `ddot_'
tron.cpp:(.text+0x460): undefined reference to `dscal_'
tron.cpp:(.text+0x47a): undefined reference to `daxpy_'

can anyone help me?

My system:
Ubuntu 12.04
Titan black
16 GB
Matlab R2014a
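For reference, `dnrm2_`, `ddot_`, `daxpy_`, and `dscal_` are Fortran BLAS symbols, so this looks like a missing BLAS library on the mex link line rather than a problem in the code. A hedged sketch of how to check (library paths and the mex invocation below are illustrative, not taken from spp_build.m):

```shell
# Hedged sketch: the liblinear mex build must link against a BLAS. With MATLAB's
# bundled BLAS that is -lmwblas; with a system BLAS, -lblas. In MATLAB, roughly:
#   mex -largeArrayDims external/liblinear-1.93_multicore/matlab/train.cpp -lmwblas
# From a shell, check whether a system BLAS exports the missing symbols:
found=""
for lib in /usr/lib/libblas.so /usr/lib/x86_64-linux-gnu/libblas.so; do
  if [ -e "$lib" ] && nm -D "$lib" 2>/dev/null | grep -q dnrm2_; then
    found="$lib"
  fi
done
if [ -n "$found" ]; then
  echo "BLAS with dnrm2_ found: $found"
else
  echo "no system libblas found; try linking MATLAB's -lmwblas"
fi
```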

When I run spp_demo, it fails with "API command not recognized"

I am running the program on Ubuntu 14.04. When I run spp_demo, I get the following problem:

spp_demo
Computing candidate regions...found 2034 candidates (in 8.096s).
Extracting CNN features from regions...Error using caffe
API command not recognized

Error in spp_features_convX (line 61)
caffe('set_gpu_available');

Error in spp_detect (line 40)
feat = spp_features_convX(im, spm_im_size, [], use_gpu);

Error in spp_demo (line 39)
dets = spp_detect(im, spp_model, spm_im_size, use_gpu);
E1223 09:10:39.927554 20714 matcaffe.cpp:393] Unknown command `set_gpu_available'

Can anyone help? Thanks!

Error using fliplr (line 18) X must be a 2-D matrix.

Hi, there is an error when I use Script_spp_voc.m with VOC 2007 to train a model:
Error using fliplr (line 18) X must be a 2-D matrix. in imwrite(fliplr(im), flip_image_at(i));
Since the function fliplr is only for 2-D matrices, should I skip this line by setting the parameter "flip" to false? Or should I use another function that works on 3-D images? Or must I use grayscale images?

thanks.

Use the network

Hi,

I want to ask: after we train and fine-tune the model, how do we actually do object detection on a raw image?
What preprocessing do we have to do to feed the image into the network?

In the fine-tune network definition file,
the input_dim values are 128, 12800, 1, 1.
What do those dimensions represent respectively?
I understand that 128 is the batch_size, but I can't figure out where the other numbers come from.

If you can provide a demo showing how to use the network to do detection, as R-CNN does,
that would be very helpful.

Thanks.
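For what it's worth, the 12800 is consistent with the spatial-pyramid-pooling arithmetic described in the paper (my reading, not confirmed against this particular prototxt): conv5 of the Zeiler network has 256 channels, and a 4-level pyramid of {6×6, 3×3, 2×2, 1×1} bins gives 50 bins per channel, so each region is pooled into a fixed-length vector of 256 × 50 = 12800 values. The trailing 1, 1 would then just be placeholder spatial dimensions for a fully-connected input.

```shell
# Hedged check: 256 conv5 channels times the 50 bins of a {6x6,3x3,2x2,1x1}
# spatial pyramid gives the 12800 in input_dim.
echo $((256 * (6*6 + 3*3 + 2*2 + 1*1)))   # prints 12800
```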

ILSVRC models

Do you by any chance have pre-trained models on the ILSVRC datasets?

Trying with eendebakpt Linux Version of Caffe

I have tried his compiled version of caffe and also the newest version of caffe, but I get the same error message either way.
When I run Script_spp_voc, here is the abort message:

WARNING: Logging before InitGoogleLogging() is written to STDERR
[libprotobuf ERROR google/protobuf/text_format.cc:245] Error parsing text-format caffe.NetParameter: 14:14: Message type "caffe.V0LayerParameter" has no field named "conv_param".
F1007 22:58:40.283839 19559 upgrade_proto.cpp:627] Check failed: ReadProtoFromTextFile(param_file, param) Failed to parse NetParameter file: /mnt/neocortex/scratch/yejiayu/_exp/SPP_net/data/cnn_model/Zeiler_conv5/Zeiler_spm_scale224_test_conv5.prototxt

It seems you are using a V0-version proto but with some V1 features.
I tried modifying the prototxt to a V1 version; reading the parameters is then fine, but reading the weight file produces an error.

What should I do with this?
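One thing that may be worth trying (I'm not certain it applies to this particular prototxt): newer Caffe trees ship `upgrade_net_proto_text` and `upgrade_net_proto_binary` tools that rewrite old-schema net definitions and weight files into the current format. A sketch, with an illustrative CAFFE_ROOT location:

```shell
# Hedged sketch: run Caffe's own upgrade tool on the old prototxt
# (and upgrade_net_proto_binary on the .caffemodel, analogously).
CAFFE_ROOT=${CAFFE_ROOT:-$HOME/caffe}          # illustrative location
UPGRADE="$CAFFE_ROOT/build/tools/upgrade_net_proto_text"
if [ -x "$UPGRADE" ]; then
  "$UPGRADE" Zeiler_spm_scale224_test_conv5.prototxt upgraded.prototxt
else
  echo "upgrade_net_proto_text not found; build caffe first"
fi
```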

Has anyone fine-tuned spp_net successfully on Linux?

Has anyone fine-tuned spp_net successfully on Linux?
@ShaoqingRen, have you fine-tuned spp_net on Linux? If you have, would you share your MATLAB interface to caffe? Mine always fails, and I don't know where the problem is, because MATLAB crashes without any further info. Thanks very much.

out of memory error

When running the Script_spp_voc.m script, processing goes fine for the first couple of images, but then MATLAB crashes. From the command line I can see the following error:

F1012 12:29:18.821614 14792 syncedmem.cpp:156] Check failed: error == cudaSuccess (2 vs. 0) out of memory

Already for the first images almost all of the GPU memory is used (I have a 2 GB card).

Is there a parameter I can set to reduce the memory usage? For R-CNN and Caffe there is often a batch size which can be reduced, but as far as I can see this is not available for SPP.
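Not an answer to the batch-size question, but while debugging it helps to watch GPU memory from a second terminal as the script runs (these query flags are standard nvidia-smi options):

```shell
# Hedged sketch: report GPU memory usage; add "-l 1" to refresh every second
# while the MATLAB script is running.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=memory.used,memory.total --format=csv || echo "nvidia-smi failed"
else
  echo "nvidia-smi not available on this machine"
fi
```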

faster RCNN

Hello, when will you open-source Faster R-CNN? Is there any schedule? Thanks in advance.
