yue-fan / cossl

Official PyTorch Implementation of "CoSSL: Co-Learning of Representation and Classifier for Imbalanced Semi-Supervised Learning" (CVPR 2022)

License: MIT License

Python 97.08% Shell 2.92%

cossl's People

Contributors

yue-fan

cossl's Issues

Cannot reproduce the experimental process.

From the code you provided, the experimental process cannot be reproduced: there is a data-processing error under PyTorch 1.0.0 in the first data-processing step. The command used was: "python train_cifar_fix.py --ratio 2 --num_max 1500 --imb_ratio_l 150 --imb_ratio_u 150 --epoch 500 --val-iteration 500 --out ./results/cifar10/fixmatch/baseline/wrn28_N1500_r150_seed1 --manualSeed 1 --gpu 2"
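Until that error is resolved, one quick way to rule out an environment mismatch is to check the interpreter's PyTorch version before launching the script. A minimal sketch; the helper below is hypothetical and not part of the repo:

```python
# Hypothetical helper (not shipped with the repo): refuse to run on the
# PyTorch 1.0.0 installation that the data-processing step reportedly
# fails under.
import torch

def check_pytorch_version(min_version=(1, 1)):
    # torch.__version__ looks like "1.4.0" or "2.0.1+cu118".
    major, minor = (int(x) for x in torch.__version__.split(".")[:2])
    if (major, minor) < min_version:
        raise RuntimeError(
            f"PyTorch {torch.__version__} detected; the first data-processing "
            "step reportedly fails on 1.0.0, so upgrade before running "
            "train_cifar_fix.py."
        )

if __name__ == "__main__":
    check_pytorch_version()
    print(f"PyTorch {torch.__version__}, "
          f"CUDA available: {torch.cuda.is_available()}")
```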

About the reproduction problem

Hi. Thanks for the great work here.

I have some confusion while trying to reproduce your work. From the training examples, it seems we must pretrain a model before the CoSSL stage. What if pretraining is not applied? Can we directly resume from a FixMatch pretrained model without running the pretraining phase (a minimal sketch of such warm-starting follows below)? Also, where is the two-phase training explained in the paper? Is your framework still a two-stage method?

Hoping to receive your reply.
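For readers with the same question: warm-starting the second stage usually amounts to loading the stage-1 (FixMatch) weights into the model before CoSSL training begins. A minimal sketch; the checkpoint path and state-dict keys below are assumptions, not the repo's confirmed checkpoint format:

```python
# Warm-start sketch: load a FixMatch stage-1 checkpoint instead of
# re-running pretraining. Path and dict keys are assumed, not confirmed.
import torch

def load_fixmatch_checkpoint(model, ema_model, path):
    ckpt = torch.load(path, map_location="cpu")
    model.load_state_dict(ckpt["state_dict"])          # assumed key
    ema_model.load_state_dict(ckpt["ema_state_dict"])  # assumed key
    return ckpt.get("epoch", 0)                        # resume epoch, if stored

# Usage (model and ema_model are the networks built by the training script):
# start_epoch = load_fixmatch_checkpoint(
#     model, ema_model,
#     "./results/cifar10/fixmatch/baseline/wrn28_N1500_r150_seed1/checkpoint.pth.tar")
```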

Accuracy dropping in the second stage.

Thank you for sharing your code.

By running your experiments, we observed the following phenomenon:

For first-stage training on CIFAR10-100-30 (FixMatch), the accuracy only reaches 73% at epoch 400. When running your second-stage code, the accuracy already reaches 83% in the first epoch (the reported accuracy is 83.3%), but the final accuracy then drops to around 82%. A very similar trend occurs in other settings such as CIFAR10-150-30 (FixMatch) and CIFAR10-10-30 (ReMixMatch). Is this a normal phenomenon, or is it necessary to run your second-stage code for the full 100 epochs? (A small plotting sketch follows after the log below.)

Here is the log:

First stage:
0.301168 0.004345 0.296823 0.896344 0.939906 0.972318 1.681776 73.340000 0.695729
0.295873 0.002372 0.293501 0.903563 0.942219 0.972020 1.717008 72.740000 0.686277
0.301200 0.002995 0.298205 0.902875 0.939906 0.972518 1.720673 72.830000 0.685875
0.301357 0.004372 0.296985 0.898281 0.940063 0.972169 1.692884 73.210000 0.690956
0.297790 0.003786 0.294004 0.902250 0.940250 0.971391 1.658857 73.270000 0.692580
0.294487 0.003274 0.291213 0.900969 0.941625 0.972495 1.689642 72.900000 0.686115
0.298541 0.004243 0.294298 0.901719 0.939562 0.972033 1.692589 72.940000 0.687492
0.296468 0.002795 0.293673 0.908281 0.941656 0.972063 1.644016 73.480000 0.696454
0.302739 0.003770 0.298968 0.904625 0.940844 0.972122 1.664308 72.900000 0.686663
0.295059 0.003456 0.291604 0.903969 0.940937 0.972344 1.689525 73.120000 0.686466
0.294181 0.003069 0.291112 0.903469 0.940875 0.971602 1.726026 72.820000 (Acc) 0.682684 (epoch 400)

Second stage:
Train Loss | Train Loss X | Train Loss U | Train Loss Teacher | Mask | Total Acc. | Used Acc. | Teacher Acc. | Test Loss | Test Acc.
0.340341 0.003496 0.336845 0.575207 0.928594 0.948562 0.973448 0.822469 0.591990 83.360000
0.386872 0.008528 0.378344 0.515435 0.933406 0.949031 0.971911 0.834375 0.631811 82.910000
0.402782 0.011959 0.390823 0.522238 0.930219 0.948625 0.972486 0.834375 0.655660 82.930000
0.413216 0.012161 0.401055 0.526547 0.929000 0.948375 0.972484 0.830688 0.683735 82.700000
0.415839 0.012906 0.402933 0.528317 0.928312 0.948531 0.972026 0.830187 0.699839 82.810000
0.423619 0.011258 0.412361 0.541394 0.928281 0.948281 0.971621 0.825500 0.718995 82.660000
0.429311 0.013463 0.415849 0.555226 0.924250 0.948125 0.971768 0.820219 0.730391 82.640000
0.433773 0.015663 0.418111 0.545593 0.920750 0.945375 0.971389 0.823125 0.731047 82.970000
0.435293 0.014411 0.420881 0.563777 0.920594 0.945375 0.970875 0.815937 0.744796 82.850000
0.433798 0.012307 0.421492 0.559605 0.922719 0.945500 0.970434 0.816500 0.747746 82.800000
0.442264 0.014329 0.427935 0.562458 0.922125 0.944688 0.970415 0.814844 0.760292 82.800000
0.441064 0.014538 0.426527 0.566264 0.923594 0.945125 0.969176 0.814469 0.759005 82.820000
0.449507 0.016262 0.433245 0.565209 0.925156 0.944875 0.969059 0.813844 0.760997 82.930000
0.442113 0.014984 0.427129 0.572219 0.924219 0.944406 0.969434 0.810406 0.773561 82.840000
0.438947 0.012537 0.426410 0.570746 0.921625 0.943969 0.968500 0.811937 0.776677 82.850000
0.435629 0.011988 0.423641 0.583040 0.925000 0.943781 0.968649 0.806125 0.778590 82.880000
0.448445 0.014712 0.433733 0.572399 0.920906 0.943469 0.968238 0.811937 0.783862 82.750000
0.451621 0.014544 0.437078 0.584219 0.920031 0.942656 0.967970 0.808719 0.770416 82.770000
0.447725 0.012515 0.435210 0.580067 0.922719 0.942937 0.967962 0.807906 0.777152 82.680000
0.443247 0.013194 0.430054 0.584696 0.919719 0.941562 0.967517 0.805625 0.771047 82.770000
0.447812 0.013839 0.433973 0.585420 0.919844 0.940813 0.966842 0.806125 0.749438 82.990000
0.448081 0.011850 0.436231 0.579196 0.919844 0.941250 0.966842 0.808875 0.760295 82.970000
0.454339 0.014338 0.440002 0.594275 0.918844 0.939438 0.966772 0.803406 0.747383 82.990000
0.453589 0.014051 0.439538 0.587974 0.920687 0.940500 0.966907 0.804250 0.754866 83.000000
0.453827 0.013399 0.440428 0.596214 0.920937 0.940844 0.966440 0.801656 0.759223 82.790000
0.446133 0.012569 0.433564 0.587262 0.920406 0.941031 0.965844 0.807438 0.761329 82.810000
0.453675 0.014042 0.439632 0.583411 0.921719 0.940406 0.964435 0.806125 0.767524 82.990000
0.457595 0.015435 0.442160 0.592732 0.919281 0.940406 0.965258 0.805312 0.752987 83.120000
0.452636 0.013724 0.438911 0.589865 0.922406 0.939781 0.965003 0.801375 0.751004 83.190000
0.449950 0.011712 0.438238 0.591601 0.919031 0.939719 0.965521 0.802344 0.755927 83.130000
0.454132 0.013091 0.441041 0.602900 0.924719 0.940531 0.964956 0.800719 0.762240 82.790000
0.450245 0.013006 0.437238 0.596008 0.925375 0.940531 0.964744 0.803469 0.761952 82.930000
0.470755 0.014885 0.455870 0.596988 0.925937 0.940562 0.964462 0.800781 0.776346 82.530000
0.447610 0.012008 0.435602 0.593819 0.926156 0.940500 0.963458 0.803000 0.785175 82.460000
0.454318 0.012133 0.442185 0.599972 0.927719 0.940250 0.963418 0.802438 0.789765 82.180000
0.448311 0.011119 0.437192 0.601203 0.925438 0.939625 0.963936 0.800937 0.794356 82.340000
0.462163 0.014101 0.448062 0.597963 0.927969 0.939719 0.963866 0.802156 0.794788 82.080000
0.460039 0.011914 0.448125 0.601799 0.929188 0.940219 0.963342 0.801500 0.800457 82.220000
0.457303 0.012526 0.444776 0.601704 0.927469 0.940625 0.963408 0.800125 0.808433 82.170000
0.454704 0.011692 0.443012 0.587907 0.929312 0.939688 0.963246 0.805656 0.804490 82.320000
0.451604 0.012016 0.439589 0.593933 0.927844 0.939781 0.962918 0.802469 0.799956 82.390000
0.453899 0.011478 0.442421 0.604976 0.927063 0.940187 0.963932 0.798438 0.805524 82.470000
0.450515 0.011658 0.438857 0.598755 0.928531 0.939906 0.962878 0.803531 0.805338 82.620000
0.458326 0.012618 0.445708 0.600216 0.928344 0.938969 0.962702 0.801687 0.815174 82.570000
0.451074 0.011571 0.439502 0.602393 0.927531 0.939219 0.962131 0.798531 0.828527 82.210000
0.453072 0.011113 0.441959 0.604168 0.924781 0.938531 0.961815 0.800281 0.824624 82.160000
0.454657 0.011541 0.443116 0.593165 0.927719 0.938250 0.961532 0.802969 0.824115 82.330000
0.453705 0.011767 0.441937 0.590361 0.928125 0.938500 0.961279 0.803844 0.818967 82.430000
0.456550 0.011688 0.444862 0.607690 0.930500 0.939719 0.961378 0.795594 0.820793 82.160000
0.452924 0.010844 0.442080 0.603745 0.928219 0.939000 0.961317 0.796719 0.814463 82.180000
0.458363 0.013534 0.444829 0.602741 0.926656 0.937375 0.960982 0.797625 0.806718 82.450000
0.450761 0.010124 0.440637 0.600629 0.926094 0.937594 0.960857 0.799687 0.812212 82.290000
0.463295 0.012741 0.450554 0.603673 0.926906 0.937063 0.960891 0.798344 0.820098 82.330000
0.460218 0.010880 0.449338 0.614476 0.930312 0.938219 0.960329 0.793625 0.826927 82.140000
0.461101 0.011399 0.449702 0.605098 0.927344 0.937312 0.960472 0.798406 0.830288 82.190000
0.458203 0.012320 0.445883 0.606587 0.929969 0.939000 0.960718 0.797250 0.827203 82.020000
0.451464 0.010127 0.441338 0.606548 0.926156 0.937094 0.960421 0.797063 0.830043 82.080000
0.457141 0.010009 0.447132 0.589182 0.929656 0.937656 0.960604 0.803750 0.841645 81.950000
0.458767 0.011321 0.447447 0.603738 0.927063 0.937875 0.960123 0.797000 0.836994 82.050000
0.454558 0.009118 0.445440 0.605655 0.930813 0.936125 0.959746 0.798594 0.840237 81.920000
0.453940 0.010890 0.443050 0.614996 0.929844 0.935906 0.958898 0.797531 0.836860 82.060000
0.450820 0.010272 0.440548 0.609532 0.931312 0.936937 0.959432 0.796937 0.837213 82.280000
0.456952 0.010680 0.446273 0.595562 0.930187 0.937562 0.960357 0.800469 0.826890 82.170000
0.455839 0.009804 0.446035 0.606575 0.930438 0.937094 0.959058 0.797406 0.821094 82.230000
0.453589 0.009900 0.443689 0.597375 0.930750 0.936594 0.959676 0.799000 0.811402 82.480000
0.455734 0.011329 0.444404 0.599060 0.929375 0.935781 0.959516 0.800094 0.817019 82.500000
0.455482 0.009801 0.445682 0.599984 0.929656 0.936594 0.958352 0.800406 0.822199 82.410000
0.456596 0.009741 0.446855 0.603422 0.929750 0.935031 0.958389 0.797375 0.820201 82.650000
0.454195 0.011235 0.442960 0.600474 0.929750 0.936312 0.958423 0.799656 0.818402 82.660000
0.460074 0.011608 0.448466 0.600283 0.932250 0.936406 0.958501 0.796687 0.814149 82.760000
0.453294 0.009251 0.444043 0.615393 0.931750 0.937000 0.958277 0.794063 0.819465 82.670000
0.461744 0.011400 0.450344 0.602313 0.934625 0.936688 0.958573 0.800031 0.827683 82.600000
0.450101 0.010814 0.439287 0.607999 0.932187 0.937844 0.958632 0.798750 0.818935 82.390000
0.454595 0.008805 0.445791 0.614396 0.932344 0.937063 0.958673 0.795781 0.815102 82.540000
0.456146 0.010353 0.445793 0.598767 0.931719 0.936813 0.957840 0.798656 0.821032 82.340000
0.461644 0.012627 0.449017 0.609026 0.931813 0.936688 0.958113 0.796156 0.810854 82.350000
0.452948 0.010476 0.442472 0.594348 0.929219 0.936094 0.958399 0.802031 0.809256 82.450000
0.453002 0.010267 0.442735 0.611528 0.931031 0.936281 0.958782 0.793594 0.809285 82.380000
0.452882 0.009092 0.443789 0.606548 0.931281 0.935937 0.958156 0.796469 0.823924 82.440000
0.455693 0.010420 0.445274 0.619858 0.930813 0.936438 0.957665 0.793656 0.816204 82.590000
0.456282 0.011277 0.445006 0.601405 0.933031 0.936187 0.957296 0.800781 0.817498 82.620000
0.456068 0.010840 0.445228 0.599826 0.929719 0.936219 0.957682 0.799406 0.815235 82.440000
0.448079 0.010436 0.437642 0.612777 0.930562 0.935344 0.956276 0.794937 0.812249 82.530000
0.458141 0.009672 0.448469 0.614246 0.931688 0.936344 0.956463 0.794469 0.822559 82.000000
0.458813 0.009300 0.449513 0.607858 0.933500 0.936750 0.956548 0.797312 0.821129 82.240000
0.455385 0.010424 0.444961 0.605785 0.933750 0.938156 0.957831 0.797906 0.803650 82.430000
0.451509 0.010414 0.441095 0.601163 0.933125 0.936875 0.956296 0.799188 0.802344 82.480000
0.454782 0.010240 0.444542 0.612834 0.932250 0.936344 0.956590 0.795281 0.802211 82.480000
0.454310 0.009726 0.444584 0.613304 0.931250 0.935156 0.956544 0.793687 0.798848 82.420000
0.459840 0.010492 0.449349 0.601623 0.935656 0.936219 0.956314 0.801687 0.805371 82.360000
0.460324 0.010277 0.450047 0.600895 0.933125 0.936125 0.957334 0.801500 0.795448 82.780000
0.461058 0.010952 0.450106 0.612254 0.936656 0.936875 0.956928 0.794250 0.794249 82.830000
0.446511 0.008885 0.437626 0.607017 0.933969 0.936187 0.956302 0.796312 0.794112 82.660000
0.464778 0.011738 0.453040 0.608275 0.935125 0.936156 0.956256 0.798063 0.795270 82.660000
0.464612 0.010793 0.453819 0.606017 0.934469 0.934906 0.955222 0.798312 0.807012 82.210000
0.454953 0.009790 0.445163 0.611463 0.931531 0.933937 0.955752 0.796375 0.829200 82.060000
0.461756 0.010506 0.451250 0.601682 0.933688 0.934500 0.955686 0.799281 0.826113 82.180000
0.457916 0.009679 0.448237 0.614798 0.936406 0.936125 0.955849 0.794344 0.825772 82.290000
0.458776 0.011607 0.447169 0.617150 0.932375 0.935031 0.955155 0.793656 0.803494 82.480000
0.450981 0.009319 0.441661 0.606260 0.931750 0.935000 0.955427 0.798125 0.818881 82.000000
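To see the trend at a glance, here is a small sketch that parses a saved copy of the second-stage log and plots the Test Acc. column per epoch. The log file name is hypothetical; the parser keeps only rows whose last field is numeric, so the header row is skipped automatically:

```python
# Plot Test Acc. (last column) per epoch from a whitespace-separated log.
import matplotlib.pyplot as plt

def read_test_acc(path):
    accs = []
    with open(path) as f:
        for line in f:
            fields = line.split()
            if not fields:
                continue
            try:
                accs.append(float(fields[-1]))  # Test Acc. is the last column
            except ValueError:
                continue  # skip the header row and any non-numeric lines
    return accs

accs = read_test_acc("second_stage.log")  # hypothetical file name
plt.plot(range(1, len(accs) + 1), accs)
plt.xlabel("epoch")
plt.ylabel("Test Acc. (%)")
plt.title("CoSSL second stage, CIFAR10-100-30 (FixMatch)")
plt.show()
```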

small imagenet127

Hi,

Could you also share the code for how you preprocess the small_imagenet127 dataset?

Many thanks,
xyk
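While waiting for the official script: "small" ImageNet variants are usually produced by downsampling every image to a fixed low resolution (the paper reports small ImageNet-127 at 32x32 and 64x64), and ImageNet-127 merges the 1000 ImageNet classes into 127 superclasses via the WordNet hierarchy. Below is a minimal downsampling sketch with hypothetical directory names; this is an assumed pipeline, not the authors' script, and the class-merging step is not shown:

```python
# Assumed preprocessing sketch: downsample ImageNet images to 32x32 with a
# box filter. Directory names are hypothetical; merging the 1000 classes
# into 127 superclasses (the "127" in ImageNet-127) is not shown here.
from pathlib import Path
from PIL import Image

SRC = Path("imagenet/train")        # hypothetical source directory
DST = Path("small_imagenet/train")  # hypothetical output directory

for img_path in SRC.rglob("*.JPEG"):
    out_path = (DST / img_path.relative_to(SRC)).with_suffix(".png")
    out_path.parent.mkdir(parents=True, exist_ok=True)
    with Image.open(img_path) as im:
        im.convert("RGB").resize((32, 32), Image.BOX).save(out_path)
```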
