Comments (6)
Yes, it is the same, but there is a precision discrepancy between the two frameworks; the feature embeddings (512-D) should not differ by much.
from facenet-caffe.
Why do my two features (fea_tf and fea_caffe) differ slightly?
from facenet-caffe.
This is the element-wise difference (fea_tf matrix - fea_caffe matrix):
result: [ 0.0279929 0.00873665 -0.0157224 0.02938854 -0.06059439 0.11609787
-0.06976449 0.03629776 -0.06149287 0.00170472 0.14634752 0.07725966
0.09093025 -0.00308898 -0.06604764 0.06349037 0.05536365 0.11500727
-0.10018672 -0.05139054 -0.04758895 -0.01037429 0.06577706 -0.06179708
-0.07531317 -0.0039532 -0.07568493 -0.02008125 -0.0418139 0.13808618
0.11878026 -0.01137355 0.03561588 0.04901341 -0.08107542 -0.06242167
0.06912767 -0.08871368 -0.03641796 0.06377206 -0.00592435 -0.05722718
0.02351412 0.15766686 0.03248164 0.03820618 0.03532988 0.08291396
-0.03328522 -0.02438381 -0.09475721 -0.01230818 0.00162978 -0.08353224
0.02902994 0.07887293 -0.05164755 -0.08894886 0.00136318 0.00426575
0.00198088 0.18585905 -0.10638526 0.01505883 -0.01936317 0.0196936
-0.06845933 0.0133239 0.03869073 0.00431218 0.04426096 0.13810582
-0.07546277 0.02436341 0.02374257 -0.04168963 -0.11732984 -0.04722159
0.0026176 -0.1401459 -0.00126939 -0.08908233 -0.00599906 0.03233561
0.00340613 -0.03821972 -0.01320491 0.07361589 0.02450724 -0.14110452
0.00581889 -0.10522359 -0.02493175 0.01568787 0.0988172 -0.01805686
-0.01111456 -0.03565313 0.03071858 0.14815015 0.09873323 0.11734572
0.02433824 0.08343159 0.01334668 0.03642815 -0.09556962 -0.15428008
-0.04039212 -0.02220548 0.06443397 0.14992149 -0.07813921 -0.03317379
-0.00745957 0.05942711 0.0415392 0.02184947 0.0184488 0.02948991
0.08485538 0.05239445 -0.05312381 -0.06769364 -0.0534694 0.02807372
0.0027022 0.04141714 0.141296 -0.08546629 0.04488822 -0.04705779
-0.01209944 0.05164552 -0.00337794 -0.06125275 -0.14202574 -0.13222681
0.10970768 0.11440851 -0.04685125 0.07265764 -0.05695726 -0.07133057
0.07868938 -0.04346891 0.02588977 -0.02053742 0.03330803 0.06236999
0.08427566 -0.127574 0.05717603 0.04369361 0.11151528 -0.01548939
-0.07210784 -0.04653572 0.00352714 -0.13886563 -0.03700655 0.04731353
-0.11780946 0.11919434 0.10043424 0.05145653 0.02503818 0.05481247
-0.125419 -0.02519303 -0.01716611 0.11137995 -0.04199541 0.16357519
0.04267142 0.07490764 0.02786589 0.14146116 -0.00681323 0.04377376
0.01806981 -0.10025909 0.08826214 0.04165353 0.10969736 0.02109899
0.10773781 0.10785908 0.05600353 -0.09640294 0.06435123 -0.00895559
0.10374393 -0.10335924 0.03759406 -0.03891125 0.00252803 0.06165926
0.04972216 0.04886426 0.05394054 -0.02635846 0.1126359 -0.14393382
0.06194483 0.0597382 -0.03942973 0.03051949 0.03079376 -0.07591378
-0.06338858 0.01135775 0.11429122 0.10195178 0.0307379 0.01328483
0.01747341 -0.04178599 0.02975645 -0.06115339 -0.05912264 -0.04622356
0.01302906 -0.00068229 0.04440828 -0.02821304 -0.02180836 -0.0410088
-0.00590983 0.01716548 -0.0131952 0.08577591 -0.05604427 -0.06879582
0.011664 -0.01894493 0.01983536 0.08131946 0.08747307 -0.00973472
0.01318846 0.10365895 0.11149615 0.00335191 0.00988697 0.02184044
0.08687351 0.11318494 -0.05623203 0.06989266 -0.05369198 -0.04048555
0.0818973 0.02466857 -0.01052385 -0.04065187 0.06708683 0.00914953
0.09427166 0.06618121 -0.03816329 -0.10710724 0.04919989 0.00281875
-0.06828211 -0.122821 0.03263432 -0.03930826 -0.00076323 -0.15917246
0.0734871 0.01027622 0.11719804 0.00776954 -0.05392749 0.00285433
0.059232 -0.0177759 -0.01906815 -0.1062744 0.05284479 -0.0333961
-0.02680271 -0.02186522 0.01933245 -0.01711252 -0.00892984 -0.11820317
-0.03343207 0.00144756 -0.05585464 -0.01235992 -0.10874947 0.13409044
0.04128192 -0.01675532 -0.0942947 -0.10616492 0.07126907 0.01391621
0.10609715 -0.00850886 0.07501094 -0.03206721 0.08237871 0.06541435
-0.00381759 -0.02035698 -0.08762591 0.10613266 0.03410246 -0.0316227
0.05058284 0.01200349 0.10508189 -0.0427785 -0.10632229 0.03103887
0.01643478 -0.00276636 0.00372653 -0.07437549 -0.1268318 0.04978336
-0.05079244 0.08892213 0.05540005 0.08597943 0.10364556 0.0248923
0.06639552 -0.0640701 -0.16022167 0.03363867 0.03125863 0.06264352
0.06997093 -0.00545098 -0.05192859 0.09219741 0.00520553 -0.10460947
0.13234806 0.05924877 -0.04698969 0.0348469 0.03053119 -0.03494171
0.0628388 0.02207169 0.02232435 -0.0050515 0.07238242 0.01112456
0.09282323 -0.00783481 0.07717082 -0.0535277 -0.05071675 -0.11035065
-0.13716115 -0.08197283 -0.10930265 -0.10215415 0.01927901 0.07963534
-0.04516111 0.06188687 -0.13690296 0.06722295 0.01396949 -0.09107953
0.00930572 -0.04261395 0.02027321 -0.00339025 0.04550293 0.00518939
0.04562241 -0.02628011 -0.06167176 -0.12071544 -0.05616446 0.0295569
-0.0105841 0.00472887 0.07754687 -0.10481101 -0.03781217 -0.04966674
0.02016702 0.08310849 -0.04744826 0.03321435 -0.0015615 0.08705556
0.0469837 0.07775217 0.01848718 -0.05713877 0.05341021 0.11235222
0.1550166 0.12813966 0.00610386 -0.09099633 -0.03296 0.01869269
-0.0320418 0.01880379 0.04723762 -0.04980495 0.05230247 0.0453691
0.0066226 -0.0172446 0.12138501 0.05590507 0.17232463 -0.05219802
0.01327718 -0.01022409 -0.11913359 -0.03308057 -0.0089714 0.02950186
0.07785319 0.02731751 0.00038546 0.05018977 0.0106928 0.0833675
-0.11972595 0.01081952 0.06567862 0.11174795 0.04442918 0.06188258
-0.03868107 0.02478879 -0.07321848 0.0194521 -0.03878763 -0.00038387
0.05106924 -0.01454129 -0.02140814 -0.08151387 0.01031941 -0.05395797
-0.03334396 0.0493455 0.10942582 0.05147921 -0.00572009 -0.03331836
-0.08417642 0.05395478 -0.04814875 0.04511426 0.01856398 0.06457835
0.08506191 -0.00476467 0.01826901 -0.0461903 0.01463338 -0.05200292
0.01504715 -0.01577496 0.11224364 -0.00612184 0.06808157 -0.04572248
-0.17151853 -0.13515484 0.01795935 -0.07021879 0.02938909 0.06101551
0.00249073 -0.01315597 -0.00741528 0.06427597 -0.04855761 0.09330046
-0.12033431 -0.01578407 0.0399746 -0.09657503 -0.03877673 -0.04391157
-0.08891184 -0.01750039 0.0819832 -0.11521423 0.01890038 0.02289248
0.05028534 0.01614029 -0.06147645 -0.03770247 0.05363531 -0.09277709
-0.13082252 0.10403828 0.02131865 0.13899687 0.04724831 -0.00130092
0.00812758 -0.01532461]
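Differences of this magnitude are only a problem if they change the direction of the embedding. A minimal sketch for judging that, using hypothetical random arrays in place of the real fea_tf and fea_caffe:

```python
import numpy as np

# Hypothetical stand-ins for the two 512-D embeddings (the real ones
# come from the TF and Caffe forward passes)
rng = np.random.default_rng(0)
fea_tf = rng.standard_normal(512).astype(np.float32)
fea_caffe = fea_tf + rng.normal(scale=0.05, size=512)

# Largest per-dimension discrepancy
max_abs_diff = np.max(np.abs(fea_tf - fea_caffe))

# Cosine similarity: close to 1.0 means the two embeddings still
# agree in direction, so face matching results should be unaffected
cos = np.dot(fea_tf, fea_caffe) / (
    np.linalg.norm(fea_tf) * np.linalg.norm(fea_caffe))

print(max_abs_diff, cos)
```

If the cosine similarity stays near 1.0, per-dimension differences around 0.1 like the ones above usually do not change verification results.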
from facenet-caffe.
Using function convertTf2Caffe(), I fed a random [1,160,160,3] matrix; the diff from my run is below:
-1.89851969e-04 1.86164398e-05 2.97982246e-04 1.29725784e-04
5.90578653e-04 -6.23032451e-04 -4.78558242e-04 -4.49642539e-05
1.08458847e-03 -9.73582268e-04 1.91682018e-04 -9.27820802e-05
3.12425196e-04 -2.71603465e-04 1.55901304e-04 2.01996416e-04
1.60138123e-04 9.74684954e-05 2.23956071e-04 6.18286431e-05
1.51652843e-04 -3.92165035e-04 -2.13630497e-04 -1.82855874e-04
-1.57237053e-04 -3.14619392e-04 -1.01815909e-04 3.76314856e-05
-7.30343163e-05 3.00955027e-04 -6.09584153e-04 7.98869878e-05
-2.83484580e-04 -2.08579004e-04 8.94232653e-06 2.83626840e-04
-2.58523971e-04 -1.61104836e-05 -6.58348203e-04 5.81356697e-04
6.44121319e-05 2.92383134e-04 -1.91170722e-04 -2.48350203e-04
-3.31748277e-04 2.46209092e-04 2.66239047e-04 3.39705497e-04
-5.57646155e-04 -3.15282494e-04 5.81927598e-05 -1.12537295e-04
1.18624419e-04 5.14023472e-04 1.22848898e-04 6.62803650e-05
4.53667715e-04 9.06139612e-05 1.78886577e-04 -1.62291341e-04
-3.95806506e-04 -1.88946724e-04 -1.07139349e-04 1.77234411e-04
-1.17751770e-05 5.54617494e-04 -5.88871539e-04 7.68974423e-05
-4.58270311e-04 -6.11945987e-04 -2.45980918e-05 -3.45706940e-04
1.48577616e-04 -6.23924658e-04 4.85457014e-04 -5.22557646e-04
-2.13913620e-04 5.14212996e-04 -3.03378329e-04 2.10764818e-04
1.41393393e-04 -9.79918987e-05 1.23560429e-04 -6.82994723e-05
2.64683738e-04 -1.70487911e-04 -1.59271061e-04 2.89883465e-04
1.91453844e-04 6.57331198e-04 3.66944820e-04 9.33540519e-04
4.66089696e-04 -2.85059214e-04 -1.39594078e-04 6.93800393e-06
-1.82744116e-05 5.62409929e-04 -1.89736485e-04 -1.69657171e-04
-3.72858718e-04 -1.47397630e-04 -4.50849533e-04 3.72387469e-04
3.23635526e-04 -8.62032175e-06 6.63965009e-04 2.37736851e-04
1.41926110e-04 7.08073378e-04 -4.40292060e-05 2.25253403e-04
-3.07828188e-04 4.42441553e-04 -3.09793279e-04 1.72751024e-04
-4.35784459e-05 4.29314561e-04 -2.46664509e-04 1.27121806e-04
-3.50024551e-04 -2.15020031e-04 4.52883542e-05 -3.11601907e-04
7.26543658e-04 -5.73769212e-05 3.95302661e-04 -4.96229157e-04
-5.11351973e-04 -2.05151737e-04 -3.53854150e-04 2.44520605e-04
4.97717410e-04 -1.96784735e-04 2.99945474e-04 9.81733203e-04
8.36234540e-05 -5.94528392e-05 -3.52296978e-04 -3.66933644e-04
6.07818365e-05 -3.03553883e-04 -8.38129316e-04 4.92874533e-04
5.78515232e-04 -4.04285267e-04 -2.62431800e-04 4.06417996e-04
3.41348350e-05 -5.08986413e-05 4.09599394e-04 -1.40190125e-04
-2.96935439e-04 2.71141529e-04 1.12600625e-04 2.22383067e-04
2.10779719e-04 1.32046640e-04 4.74974513e-05 3.90492380e-04
4.18651849e-04 -1.69057399e-04 -2.38161534e-04 3.09459865e-05
-3.74453142e-04 -1.32516026e-04 -7.20135868e-05 1.62929296e-04
-1.44194812e-04 -9.14707780e-05 -1.75312161e-05 -2.07941048e-04
2.40668654e-04 -3.16459686e-04 1.99855771e-04 1.59488991e-04
3.38703394e-04 -6.20095059e-04 -4.93519008e-04 -4.74154949e-04
2.23240815e-04 3.22539359e-04 -7.30345026e-04 6.98789954e-05
-2.11581588e-04 8.23880779e-04 -3.72409821e-04 3.56096774e-04
1.33514404e-05 3.53984535e-04 -1.93167478e-04 -2.81427056e-05
1.42395496e-04 -1.37954950e-04 1.96158886e-04 -5.45121729e-05
-2.62022950e-05 1.74932182e-04 -2.60232016e-04 -5.57340682e-04
1.82842836e-04 -4.63626580e-04 2.34778970e-04 2.05028802e-04
-1.93120912e-04 1.82043761e-04 1.05723739e-05 9.71257687e-05
6.94871880e-04 5.33259474e-04 1.66788697e-04 9.30055976e-05
-5.10644168e-05 -1.85979530e-04 -1.04717910e-04 -4.06759791e-04
-3.72700393e-04 4.90684062e-04 -2.52127647e-05 -1.80631876e-04
-1.76142901e-04 -1.79574825e-04 -3.37308273e-04 -4.45853919e-04
7.45896250e-05 2.71636993e-04 -3.16612422e-04 5.27057331e-04
-6.26478344e-04 3.17085534e-04 -4.46993858e-04 -3.75978649e-04
-1.47148967e-04 -2.26631761e-04 4.05117869e-04 1.24990009e-04
-2.74427235e-04 8.46944749e-05 -1.85986515e-04 3.08431685e-04
-8.47931951e-05 2.20995396e-04 1.49480999e-04 6.00144267e-06
-2.78379768e-04 4.15313989e-04 -4.07076906e-04 5.10737300e-06
1.47017650e-04 -3.81246209e-05 3.17823142e-04 1.50665641e-04
-2.45124102e-05 -4.44232486e-04 2.75266357e-04 7.63728283e-04
-2.21014023e-04 -2.01687217e-05 8.85333866e-05 5.95301390e-06
-7.59709626e-04 -1.39511190e-04 6.30803406e-05 2.33479775e-04
9.94503498e-05 -4.87342477e-05 -9.76510346e-05 1.16935931e-04
-6.40206970e-04 1.32256187e-04 -8.20953399e-04 -2.92474870e-04
1.04770530e-04 -3.26529145e-04 9.58517194e-05 -1.31957233e-04
1.13395974e-04 -1.34527683e-04 -1.16070732e-04 4.70355153e-05
-4.20454890e-04 7.21037388e-04 2.84038484e-04 1.53083354e-04
-3.25448811e-04 3.53241805e-04 -4.10452485e-05 -6.09327108e-05
2.40704045e-04 -2.57350504e-04 1.37360767e-04 -8.45938921e-05
3.16262245e-04 -1.92165375e-04 3.83784994e-04 -3.13244760e-04
1.15516782e-03 -1.81406736e-04 1.49431638e-04 -3.37604433e-07
-4.42683697e-04 4.35424969e-04 -2.62759626e-04 1.01782382e-04
-3.36691737e-04 4.48327512e-04 3.44660133e-04 -3.29338014e-04
4.35754657e-04 -6.04040921e-04 -4.62977216e-04 4.14147973e-04
2.74451450e-04 2.15694308e-05 2.43343413e-04 -1.35820359e-04
2.85107642e-04 1.57961622e-04 5.28655946e-04 3.02288681e-05
-2.10002065e-04 1.63779594e-04 1.54960901e-04 1.88812613e-04
9.98247415e-05 -2.98704486e-04 4.74549830e-04 8.35319050e-04
-2.36183405e-06 3.53006646e-04 -2.27933750e-04 2.16826797e-04
6.57292083e-04 -3.40048224e-04 -3.02297529e-04 -1.79000199e-05
-1.15111470e-06 -2.39321031e-04 3.69183719e-04 -8.73189420e-05
-3.03659588e-04 -4.04939055e-06 -5.28287143e-04 8.34278762e-06
3.66732478e-04 -1.71206892e-04 2.13734806e-04 7.74897635e-04
-3.32750380e-04 -2.40117311e-04 3.70567665e-04 4.75056469e-04
-1.53899193e-04 -2.13304535e-04 -5.68494201e-04 -2.48316675e-04
6.97672367e-05 3.95789975e-05 4.20673867e-04 1.22796744e-04
3.01668770e-04 2.88983807e-04 -2.27622688e-04 4.61582094e-05
1.50222331e-04 5.10662794e-05 -2.75790691e-04 -4.64348122e-04
1.50432810e-04 -1.98451802e-04 -5.17979264e-04 -5.52180223e-04
-1.91964209e-04 4.79239970e-05 1.83504075e-04 3.51462513e-04
5.20255417e-04 -5.49077056e-04 -5.55290841e-04 2.75209546e-04
-2.32453924e-04 -6.61332160e-04 4.10135835e-05 2.40752473e-04
-4.35065478e-04 5.78388572e-05 -5.35553321e-04 1.36308372e-04
6.05644658e-04 -4.55751549e-04 4.09148633e-05 1.41561031e-06
1.48963183e-04 -1.88861042e-04 2.97532417e-04 -2.73436308e-05
6.03377819e-04 -7.49118626e-05 -4.08329070e-05 -2.30047852e-04
-1.20595098e-04 1.34214759e-04 -2.52332538e-04 -6.24042004e-04
4.13861126e-04 1.52524561e-04 -2.99341977e-04 1.78590417e-05
4.21278179e-04 2.18518078e-04 3.13905999e-04 4.64275479e-04
1.00035220e-04 -5.82799315e-04 -2.31023878e-04 3.56823206e-04
-2.51505524e-04 1.11836940e-04 -2.51635909e-04 4.03001904e-04
-3.71186063e-04 -9.80189070e-05 -1.90600753e-04 1.27118081e-04
-4.41980548e-04 -9.57250595e-05 -5.64232469e-04 -7.28752464e-04
-1.33565627e-05 2.22846866e-05 -2.40376918e-04 7.84248114e-05
3.67172761e-04 5.28875738e-04 -3.28272581e-05 -1.47173414e-04
1.73831359e-04 -3.00290063e-04 -1.84352975e-04 -2.70418823e-04
6.73606992e-04 1.78738497e-04 5.68114221e-04 7.51763582e-06
2.83452682e-04 2.60267407e-04 -8.97981226e-05 -4.72381711e-04
-3.60846519e-04 5.21302223e-04 -8.85989517e-04 -3.10130417e-04
-1.85526907e-04 -6.16163015e-05 -7.06240535e-05 2.56404281e-04
2.86150724e-04 -2.37233937e-04 1.26801431e-04 -1.42356381e-04
3.60529870e-04 -3.00109386e-04 -4.73760068e-04 -1.02807209e-03
2.33627856e-04 5.74629521e-05 -4.45742160e-04 -2.81635672e-04
-2.04078853e-04 4.62248921e-04 -4.05784696e-04 -3.28447670e-04
-4.43700701e-04 -1.68975443e-04 2.19687819e-04 1.27624720e-04
-8.84756446e-05 4.67114151e-04 4.08012420e-06 6.84559345e-05
1.13315880e-04 -3.65212560e-04 -1.33501366e-04 -2.94370577e-04
-5.92321157e-06 6.55129552e-05 2.79374421e-04 -1.44112855e-04
-1.57050788e-04 1.47652812e-04 -1.29436143e-04 -2.09135935e-04
7.05868006e-05 2.87716743e-04 1.69873238e-05 2.70985067e-04
-4.74393368e-04 1.18675176e-04 7.16783106e-05 2.09966674e-05
-8.08089972e-04 -3.28132883e-04 -2.26557255e-04 -2.33748928e-04
2.72260979e-04 1.35331415e-04 4.04806808e-04 -6.74679875e-04
4.14922833e-04 -1.91716477e-04 1.39487907e-04 2.54428014e-05
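The run that produced this diff can be sketched as follows. The forward-pass helpers are hypothetical placeholders (the real calls go through the TF session and the converted Caffe net); the runnable part only shows the input layouts, since TF takes NHWC and Caffe takes NCHW:

```python
import numpy as np

# Random input batch in NHWC layout, as fed to the TF model
x_nhwc = np.random.rand(1, 160, 160, 3).astype(np.float32)

# Caffe expects NCHW, so transpose before feeding the converted model
x_nchw = x_nhwc.transpose(0, 3, 1, 2)

# emb_tf = run_tf_model(x_nhwc)        # hypothetical: TF forward pass
# emb_caffe = run_caffe_model(x_nchw)  # hypothetical: Caffe forward pass
# diff = emb_tf - emb_caffe            # values on the order of 1e-4, as above

print(x_nhwc.shape, x_nchw.shape)
```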
I wonder how you run it?
from facenet-caffe.
Thanks, I have solved it. The assignment "var1 = sess.run(tf.get_default_graph().get_tensor_by_name('InceptionResnetV1/Conv2d_1a_3x3/BatchNorm/beta:0'))" followed by "net.params['Conv2d_1a_3x3/BatchNorm'][0].data[...] = var1" should be placed before the call "Conv_BN_Scale_Relu('Conv2d_1a_3x3', 'InceptionResnetV1/Conv2d_1a_3x3', sess, net)".
from facenet-caffe.
@LeeTaiTai
Hi, I found the reason: perhaps you did not apply prewhiten in calcTFVector. I have fixed this bug.
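For reference, prewhitening in the original facenet code normalizes each input image to zero mean and roughly unit variance before inference; skipping it on one side easily explains embedding mismatches. A sketch:

```python
import numpy as np

def prewhiten(x):
    """Normalize an image to zero mean and unit variance, in the style of
    facenet's preprocessing (the 1/sqrt(size) floor guards tiny std)."""
    mean = np.mean(x)
    std = np.std(x)
    std_adj = np.maximum(std, 1.0 / np.sqrt(x.size))
    return (x - mean) / std_adj

img = np.random.rand(160, 160, 3).astype(np.float32)
whitened = prewhiten(img)
print(whitened.mean(), whitened.std())
```

Both the TF path and the Caffe path must apply the same preprocessing, or the two embeddings will diverge regardless of how exact the weight conversion is.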
from facenet-caffe.