sec's People

Contributors

hy9be, kolesman

sec's Issues

Could you provide the evaluation code?

I am trying to reimplement your SEC paper in TensorFlow, and everything works well except for the evaluation.
I converted the SEC.caffemodel you provided and used multiscale input to evaluate it, but the final result only reaches 0.49, while it is 0.507 in your paper. Could you upload the evaluation code for this paper?
Thanks ~
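
For reference while the authors' evaluation code is unavailable, here is a minimal multiscale-evaluation sketch; predict_fn, the scale set, and the fusion by plain averaging are all assumptions, not the paper's exact protocol:

import cv2
import numpy as np

def multiscale_predict(predict_fn, image, scales=(0.5, 0.75, 1.0)):
    # Average per-class probabilities over several input scales, then
    # take the argmax. predict_fn(img) -> (C, h, w) softmax output is a
    # hypothetical wrapper around the network's forward pass.
    h, w = image.shape[:2]
    acc = None
    for s in scales:
        scaled = cv2.resize(image, (int(w * s), int(h * s)))
        probs = predict_fn(scaled)                            # (C, h', w')
        probs = cv2.resize(probs.transpose(1, 2, 0), (w, h))  # back to (h, w, C)
        acc = probs if acc is None else acc + probs
    return np.argmax(acc, axis=2)                             # (h, w) label map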

Pascal VOC image set localization fine-tuning details

Could you provide more details about the localization fine-tuning on Pascal VOC?

For example, the 10582 image labels for input_list.txt? I would really appreciate it if you could provide them; I know they can be extracted from the dataset annotations... but I am lazy :-P

Format of localization cues

Hi,
This is an interesting work. I am confused about the format of the contents of the localization_cues pickle file. I understand that datafile['id_labels'] contains a list of the categories present, but what about datafile['id_cues']? I also understand that you compute the foreground localization cues using CAM, but I am not sure about the format in which they are stored. If it is easy, could you please share the code you used to create the pickle file?
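
While waiting for an official reply, a reading sketch may help: in the training code the cues appear to be index arrays that get scattered back into a dense per-class grid. The key naming and the 21x41x41 grid size below are assumptions taken from that code, not a documented format:

import pickle
import numpy as np

with open('localization_cues.pickle', 'rb') as f:
    data = pickle.load(f)

image_id = '0'                                # hypothetical key
labels = data['%s_labels' % image_id]         # indices of classes present
cls, rows, cols = data['%s_cues' % image_id]  # parallel index arrays (assumed)

# Scatter the sparse cues into a dense per-class seed mask on the
# network's assumed 41x41 output grid (20 classes + background).
cue_map = np.zeros((21, 41, 41), dtype=np.float32)
cue_map[cls, rows, cols] = 1.0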

Failed to install CRF

C:\caffe\SEC>pip install CRF/
Processing c:\caffe\sec\crf
Building wheels for collected packages: CRF
Running setup.py bdist_wheel for CRF ... error
Complete output from command c:\users\appdata\local\continuum\anaconda3\python.exe -u -c "import setuptools, tokenize;__file__='C:\Users\AppData\Local\Temp\pip-req-build-z7mghxx5\setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d C:\Users\AppData\Local\Temp\pip-wheel-y2195p9z --python-tag cp35:
running bdist_wheel
running build
running build_py
creating build
creating build\lib.win-amd64-3.5
creating build\lib.win-amd64-3.5\krahenbuhl2013
copying krahenbuhl2013\CRF.py -> build\lib.win-amd64-3.5\krahenbuhl2013
copying krahenbuhl2013\__init__.py -> build\lib.win-amd64-3.5\krahenbuhl2013
running build_ext
building 'krahenbuhl2013/wrapper' extension
creating build\temp.win-amd64-3.5
creating build\temp.win-amd64-3.5\Release
creating build\temp.win-amd64-3.5\Release\krahenbuhl2013
creating build\temp.win-amd64-3.5\Release\src
C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN\x86_amd64\cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MD -Ic:\users\appdata\local\continuum\anaconda3\lib\site-packages\numpy\core\include -Iinclude -I/usr/include/eigen3 -Ic:\users\appdata\local\continuum\anaconda3\include -Ic:\users\21933075\appdata\local\continuum\anaconda3\include "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\INCLUDE" "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\ATLMFC\INCLUDE" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.14393.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\NETFXSDK\4.6.1\include\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.14393.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.14393.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.14393.0\winrt" /EHsc /Tpkrahenbuhl2013/wrapper.cpp /Fobuild\temp.win-amd64-3.5\Release\krahenbuhl2013/wrapper.obj
wrapper.cpp
c:\users\21933075\appdata\local\continuum\anaconda3\lib\site-packages\numpy\core\include\numpy\npy_1_7_deprecated_api.h(12) : Warning Msg: Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION
c:\users\21933075\appdata\local\temp\pip-req-build-z7mghxx5\include\unary.h(28): fatal error C1083: Cannot open include file: 'Eigen/Core': No such file or directory
error: command 'C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN\x86_amd64\cl.exe' failed with exit status 2


Failed building wheel for CRF
Running setup.py clean for CRF
Failed to build CRF
Installing collected packages: CRF
Running setup.py install for CRF ... error
Complete output from command c:\users\appdata\local\continuum\anaconda3\python.exe -u -c "import setuptools, tokenize;__file__='C:\Users\AppData\Local\Temp\pip-req-build-z7mghxx5\setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record C:\Users\21933075\AppData\Local\Temp\pip-record-d2vpk7nm\install-record.txt --single-version-externally-managed --compile:
running install
running build
running build_py
creating build
creating build\lib.win-amd64-3.5
creating build\lib.win-amd64-3.5\krahenbuhl2013
copying krahenbuhl2013\CRF.py -> build\lib.win-amd64-3.5\krahenbuhl2013
copying krahenbuhl2013\__init__.py -> build\lib.win-amd64-3.5\krahenbuhl2013
running build_ext
building 'krahenbuhl2013/wrapper' extension
creating build\temp.win-amd64-3.5
creating build\temp.win-amd64-3.5\Release
creating build\temp.win-amd64-3.5\Release\krahenbuhl2013
creating build\temp.win-amd64-3.5\Release\src
C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN\x86_amd64\cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MD -Ic:\users\21933075\appdata\local\continuum\anaconda3\lib\site-packages\numpy\core\include -Iinclude -I/usr/include/eigen3 -Ic:\users\appdata\local\continuum\anaconda3\include -Ic:\users\appdata\local\continuum\anaconda3\include "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\INCLUDE" "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\ATLMFC\INCLUDE" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.14393.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\NETFXSDK\4.6.1\include\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.14393.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.14393.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.14393.0\winrt" /EHsc /Tpkrahenbuhl2013/wrapper.cpp /Fobuild\temp.win-amd64-3.5\Release\krahenbuhl2013/wrapper.obj
wrapper.cpp
c:\users\appdata\local\continuum\anaconda3\lib\site-packages\numpy\core\include\numpy\npy_1_7_deprecated_api.h(12) : Warning Msg: Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION
c:\users\appdata\local\temp\pip-req-build-z7mghxx5\include\unary.h(28): fatal error C1083: Cannot open include file: 'Eigen/Core': No such file or directory
error: command 'C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN\x86_amd64\cl.exe' failed with exit status 2
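
The root cause in both logs is the same: the compile line passes -I/usr/include/eigen3, a Unix path that does not exist on Windows, so Eigen/Core cannot be found. One workaround is to point the extension at a local Eigen checkout. A hypothetical sketch of such a change, not the repository's actual setup.py (EIGEN_DIR and the source list are assumptions):

from glob import glob
from setuptools import setup, Extension

EIGEN_DIR = r'C:\libs\eigen3'  # assumption: wherever Eigen was unpacked

wrapper = Extension(
    'krahenbuhl2013/wrapper',
    # The build log also creates build\...\src, so the densecrf sources
    # under src/ are assumed to be part of the extension.
    sources=['krahenbuhl2013/wrapper.cpp'] + glob('src/*.cpp'),
    include_dirs=['include', EIGEN_DIR],
    language='c++',
)

setup(name='CRF', packages=['krahenbuhl2013'], ext_modules=[wrapper])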

Code to generate the localization_cues.pickle

Hi,

Could you please provide the complete code to generate the localization_cues.pickle?

I have tried to use the code under the weak-localization folder to generate localization_cues.pickle, but I can only obtain 49.8% on the PASCAL VOC 2012 val set.

Thank you!
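
Not the authors' code, but the paper describes the foreground cues as CAM activations thresholded at a fraction of each class's maximum. A sketch of that rule (the 0.2 threshold and the output format are assumptions):

import numpy as np

def cues_from_cam(cam, present_classes, fg_thresh=0.2):
    # For each class present in the image, keep grid cells whose CAM
    # activation exceeds fg_thresh of that class's maximum, and return
    # (class, row, col) index arrays. cam is (C, H, W); the threshold
    # and output format are assumptions, not the authors' exact code.
    cls_idx, rows, cols = [], [], []
    for c in present_classes:
        r, co = np.nonzero(cam[c] > fg_thresh * cam[c].max())
        cls_idx.extend([c] * len(r))
        rows.extend(r)
        cols.extend(co)
    return np.array(cls_idx), np.array(rows), np.array(cols)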

Training

Hi, I am getting the following error:
/_caffe.so: undefined symbol: _ZN5caffe3NetIfE21CopyTrainedLayersFromERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE
terminate called after throwing an instance of 'boost::python::error_already_set'
*** Aborted at 1636388098 (unix time) try "date -d @1636388098" if you are using GNU date ***
PC: @ 0x7fb7edd7afb7 gsignal
*** SIGABRT (@0x3eb0000165a) received by PID 5722 (TID 0x7fb7f0105200) from PID 5722; stack trace: ***
@ 0x7fb7edd7b040 (unknown)
@ 0x7fb7edd7afb7 gsignal
@ 0x7fb7edd7c921 abort
@ 0x7fb7ee3d1957 (unknown)
@ 0x7fb7ee3d7ae6 (unknown)
@ 0x7fb7ee3d7b21 std::terminate()
@ 0x7fb7ee3d7da9 __cxa_rethrow
@ 0x7fb7ef523aef caffe::GetPythonLayer<>()
@ 0x7fb7ef643300 caffe::Net<>::Init()
@ 0x7fb7ef645e4e caffe::Net<>::Net()
@ 0x7fb7ef650836 caffe::Solver<>::InitTrainNet()
@ 0x7fb7ef650e14 caffe::Solver<>::Init()
@ 0x7fb7ef6510ff caffe::Solver<>::Solver()
@ 0x7fb7ef66e831 caffe::Creator_SGDSolver<>()
@ 0x55cb7c8cdf70 train()
@ 0x55cb7c8ca37b main
@ 0x7fb7edd5dbf7 __libc_start_main
@ 0x55cb7c8cad6a _start

Caffe is correctly installed with WITH_PYTHON_LAYER := 1, though. I compiled it using the commands:

make all -j6 # 6 represents number of CPU Cores
make pycaffe -j6 # 6 represents number of CPU Cores

Can you help me? Thanks
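
For anyone hitting the same error: the __cxx11 in the mangled symbol points to a C++11-ABI mismatch between _caffe.so and the libcaffe it loads, often caused by a stale build or by picking up a different caffe module. A quick diagnostic sketch:

# Reproduce the import outside the training binary. If this fails too,
# the _caffe.so on PYTHONPATH was built against a different libcaffe or
# compiler ABI (rebuild after make clean); if it succeeds, the training
# binary's embedded interpreter is finding a different, stale caffe module.
import caffe
print(caffe.__file__)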

Installation issue: libXdmcp.so.6: cannot open shared object file

When I follow the installation guide you provided and try to run a prediction with:
python demo.py --model SEC.caffemodel --image ~/1475186965759787059.jpg --smooth --output ~/1475186965759787059_result_sec.png
I get the following error:
...
from PyQt4 import QtCore, QtGui
ImportError: libXdmcp.so.6: cannot open shared object file: No such file or directory
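
One workaround, assuming the PyQt4 import is pulled in by matplotlib's default interactive backend rather than by SEC itself: force a non-GUI backend before anything imports pyplot, so X11 libraries such as libXdmcp are never loaded. A sketch:

# Select a non-GUI matplotlib backend before any pyplot import, so the
# PyQt4 backend (and the missing libXdmcp.so.6 it drags in) is skipped.
# Assumes the PyQt4 import in the traceback comes from matplotlib.
import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt  # safe to import after selecting Agg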

training error

Hi,
I'm trying to train the network and I'm getting the following error. Any idea how to fix it?

Couldn't import dot_parser, loading of dot files will not be possible.
Traceback (most recent call last):
File "/home/mlv/object/SEC/pylayers/pylayers/init.py", line 1, in
from .pylayers import *
File "/home/mlv/object/SEC/pylayers/pylayers/pylayers.py", line 11, in
from krahenbuhl2013 import CRF
File "/home/mlv/object/SEC/CRF/krahenbuhl2013/init.py", line 1, in
from .CRF import *
File "/home/mlv/object/SEC/CRF/krahenbuhl2013/CRF.py", line 1, in
from krahenbuhl2013.wrapper import DenseCRF
ImportError: No module named wrapper
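
The traceback's paths suggest Python is importing the CRF/ source tree, where only the .py files live; the compiled wrapper module only appears after a successful install or an in-place build. A quick check (the path below is taken from the traceback):

# List the package directory: a successful build should leave a compiled
# wrapper extension (wrapper.so or similar) alongside CRF.py. If only
# the .py files are present, the C++ extension was never built here.
import os
print(os.listdir('/home/mlv/object/SEC/CRF/krahenbuhl2013'))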

How to link Eigen? Cannot install the CRF wrapper through pip install CRF/

I have downloaded and tested eigen3 on my instance.

However, while pip-installing the CRF/ wrapper, I get:

krahenbuhl2013/../include/unary.h:28:22: fatal error: Eigen/Core: No such file or directory
compilation terminated.
error: command 'gcc' failed with exit status 1

How are you linking Eigen (the source directory or the build directory) with the CRF wrapper?
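
For what it's worth, the CRF setup passes -I/usr/include/eigen3 on every platform (visible in the compile lines of the Windows logs above), so downloading Eigen elsewhere does not help unless setup.py is edited; the headers must resolve at that exact path (on Debian/Ubuntu, the libeigen3-dev package puts them there). A one-line check:

# True only if Eigen's headers resolve where the CRF setup.py expects
# them; otherwise edit setup.py's include dirs or install libeigen3-dev.
import os
print(os.path.exists('/usr/include/eigen3/Eigen/Core'))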

Train localization cues for different datasets

Hi,
I was wondering whether you could make available the files to train the localization cues on different datasets. Specifically, I assume that to create the deploy.prototxt you:

  • Removed Dropout
  • Removed the annotations
  • Removed the loss
  • And the solver.prototxt was identical to this one

It would also be great if you could share the script to create the .pickle files.

Recommended caffe version? / Incompatible protobufs

Hi,
could you indicate which version of caffe you have been using this with?
I have just used the latest version of caffe-master, but somehow the protobufs seem to be incompatible.

Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/pylayers/__init__.py", line 1, in <module>
    from .pylayers import *
  File "/usr/local/lib/python2.7/dist-packages/pylayers/pylayers.py", line 1, in <module>
    import caffe
  File "/home/holger/caffe/python/caffe/__init__.py", line 1, in <module>
    from .pycaffe import Net, SGDSolver, NesterovSolver, AdaGradSolver, RMSPropSolver, AdaDeltaSolver, AdamSolver, NCCL, Timer
  File "/home/holger/caffe/python/caffe/pycaffe.py", line 15, in <module>
    import caffe.io
  File "/home/holger/caffe/python/caffe/io.py", line 8, in <module>
    from caffe.proto import caffe_pb2
  File "/home/holger/caffe/python/caffe/proto/caffe_pb2.py", line 11, in <module>
    from google.protobuf import descriptor_pb2
  File "/usr/local/lib/python2.7/dist-packages/google/protobuf/descriptor_pb2.py", line 213, in <module>
    serialized_end=3897,
  File "/usr/local/lib/python2.7/dist-packages/google/protobuf/descriptor.py", line 602, in __new__
    return _message.default_pool.FindEnumTypeByName(full_name)
KeyError: "Couldn't find enum google.protobuf.FieldOptions.JSType"
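
That KeyError is characteristic of a protobuf runtime older than the protoc that generated the *_pb2.py files (FieldOptions.JSType only exists in newer protobuf releases); checking the runtime version and then upgrading protobuf or regenerating the pb2 files usually resolves it. A sketch:

# Print the protobuf runtime version; it must be at least as new as the
# protoc that generated caffe_pb2.py for the descriptor pool to contain
# FieldOptions.JSType.
import google.protobuf
print(google.protobuf.__version__)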

Global weighted rank pooling in PyTorch

Hello all,
Could you please explain how to apply global weighted rank pooling in PyTorch? For your reference, the article in question is cited below. Thank you.

@inproceedings{kolesnikov2016seed,
  title={Seed, Expand and Constrain: Three Principles for Weakly-Supervised Image Segmentation},
  author={Kolesnikov, Alexander and Lampert, Christoph H.},
  booktitle={European Conference on Computer Vision ({ECCV})},
  year={2016},
  organization={Springer}
}
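
Not the authors' implementation, but global weighted rank pooling has a compact PyTorch translation of the paper's definition: sort each class's scores in descending order, weight the j-th ranked score by d^(j-1), and normalize by the sum of the weights; d=1 recovers global average pooling and d=0 recovers max pooling (the paper uses separate decay values for foreground and background classes). A sketch:

import torch

def gwrp(scores, d):
    # Global weighted rank pooling, following the paper's definition.
    # scores: (N, C, H, W) per-class score maps; d: decay in [0, 1].
    # d = 1 gives global average pooling, d = 0 global max pooling.
    n, c, h, w = scores.shape
    ranked, _ = scores.reshape(n, c, h * w).sort(dim=2, descending=True)
    weights = d ** torch.arange(h * w, dtype=scores.dtype,
                                device=scores.device)  # d**(j-1), j = 1..HW
    return (ranked * weights).sum(dim=2) / weights.sum()  # (N, C)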

Some questions about training which could be helpful for all

Hello,

I would appreciate it if you could answer these questions; at least nobody will need to ask them again in the future.

  1. I have created a small dataset in VOC format and I want to train on it using a pre-trained model. I should mention that the number of classes is two. What should I do, step by step?

  2. I have a single 6 GB GPU. Can fine-tuning be done on this?

  3. How can I test the new model on an image, video, or video stream (webcam or similar)?

  4. How can I reduce the false detection rate?

Pretrained Caffe Model

Hello,

Thank you for your complete instructions on performing semantic segmentation. Following the idea of SEC, I am trying to implement it in PyTorch. However, when I try to download the provided pretrained weights using "wget", a "403 Forbidden" error occurs. I am wondering whether there is any way I can download the pretrained model, just for research purposes?

protobuf 2.6.1

Can I run SEC with protobuf 2.6.1? What should I do?
