
Comments (13)

nok avatar nok commented on June 9, 2024

Hello @Gizmomens ,

the integrity check isn't supported on Windows operating systems. Right now I can't help you, because your issue isn't reproducible on my side. Could you therefore upload and share the original model (in pickle format) and the data you used?

Darius

from sklearn-porter.

Gizmomens avatar Gizmomens commented on June 9, 2024

Hi, thanks for the response! I've attached the model and the data set, the original scikit model on python was giving me an accuracy of ~70%.

model_data.zip

from sklearn-porter.

nok avatar nok commented on June 9, 2024

Hello @Gizmomens,

using your provided model and data, the integrity check returned an accuracy of 0.993849938499385 (1 = best).

(In addition, I found a small bug and a case where non-numeric data used for the integrity check can cause internal errors. Both issues will be fixed in the next patch, 0.6.2.)

Darius

from sklearn-porter.

Gizmomens avatar Gizmomens commented on June 9, 2024

So the low accuracy in the exported Java model is because of these bugs then?

from sklearn-porter.

nok avatar nok commented on June 9, 2024

No, it's not; the bugs affect only the integrity check.

I guess you overfitted your model, or the data on your device has changed.
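One way to see whether a forest has overfit is to compare the training accuracy against a cross-validated accuracy. A minimal sketch on synthetic data (the dataset and parameters here are illustrative, not your actual model):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic 4-class data standing in for the real training set.
X, y = make_classification(n_samples=800, n_features=10, n_informative=4,
                           n_classes=4, random_state=0)

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)

train_acc = clf.score(X, y)                       # accuracy on the data it saw
cv_acc = cross_val_score(clf, X, y, cv=5).mean()  # accuracy on held-out folds

# A large gap between the two suggests the model memorized the training set.
print('train: %.3f, cross-validated: %.3f' % (train_acc, cv_acc))
```

A fully grown forest (max_depth=None) almost always scores near 1.0 on its own training data, so only the cross-validated number is meaningful.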

from sklearn-porter.

Gizmomens avatar Gizmomens commented on June 9, 2024

But the accuracy is bad even if I use the same test data in Java. The scikit-learn model gives 70% on that test set while the Java code gives me around 10%. I debugged the Java code and everything is fine: the features are passed in as they should be, it's just that the trees never return the right class.

from sklearn-porter.

nok avatar nok commented on June 9, 2024

Here are the results with an integrity score of 0.993849938499385.
The labels you used: 0 = Above, 1 = Below, 2 = Left, 3 = Right.

Sample index | prediction from Python | prediction from Java (transpiled by the porter) | equal?
0 2 2 True
1 1 1 True
2 2 2 True
3 1 1 True
4 2 2 True
5 2 2 True
6 2 2 True
7 2 2 True
8 1 1 True
9 2 2 True
10 2 2 True
11 1 1 True
12 2 2 True
13 2 2 True
14 2 2 True
15 2 2 True
16 1 1 True
17 1 1 True
18 2 2 True
19 2 2 True
20 1 1 True
21 1 1 True
22 2 2 True
23 2 2 True
24 2 2 True
25 2 2 True
26 2 2 True
27 2 2 True
28 2 2 True
29 2 2 True
30 2 2 True
31 1 1 True
32 2 2 True
33 2 2 True
34 1 1 True
35 2 2 True
36 2 2 True
37 2 2 True
38 2 2 True
39 2 2 True
40 2 2 True
41 2 2 True
42 2 2 True
43 2 2 True
44 2 2 True
45 2 2 True
46 2 2 True
47 2 2 True
48 2 2 True
49 2 2 True
50 2 2 True
51 2 2 True
52 2 2 True
53 2 2 True
54 2 2 True
55 2 2 True
56 2 2 True
57 2 2 True
58 2 2 True
59 2 2 True
60 2 2 True
61 2 2 True
62 2 2 True
63 2 2 True
64 2 2 True
65 2 2 True
66 2 2 True
67 2 2 True
68 2 2 True
69 2 2 True
70 1 1 True
71 2 2 True
72 2 2 True
73 2 2 True
74 2 2 True
75 0 0 True
76 2 2 True
77 2 2 True
78 2 2 True
79 2 2 True
80 2 2 True
81 2 2 True
82 2 2 True
83 2 2 True
84 2 2 True
85 1 1 True
86 2 2 True
87 2 2 True
88 2 2 True
89 2 2 True
90 2 2 True
91 2 2 True
92 2 2 True
93 1 1 True
94 2 2 True
95 0 0 True
96 2 2 True
97 1 1 True
98 2 2 True
99 2 2 True
100 2 2 True
101 2 2 True
102 2 2 True
103 2 2 True
104 1 1 True
105 2 2 True
106 2 2 True
107 2 2 True
108 2 2 True
109 2 2 True
110 2 2 True
111 2 2 True
112 1 1 True
113 2 2 True
114 2 2 True
115 2 2 True
116 2 2 True
117 2 2 True
118 2 2 True
119 2 2 True
120 2 2 True
121 2 2 True
122 2 2 True
123 2 2 True
124 2 2 True
125 2 2 True
126 2 2 True
127 2 2 True
128 2 2 True
129 2 2 True
130 2 2 True
131 1 1 True
132 2 2 True
133 2 2 True
134 1 1 True
135 2 2 True
136 0 0 True
137 1 1 True
138 2 2 True
139 2 2 True
140 2 2 True
141 2 2 True
142 2 2 True
143 2 2 True
144 2 2 True
145 2 2 True
146 2 2 True
147 2 2 True
148 2 2 True
149 2 2 True
150 2 2 True
151 2 2 True
152 1 1 True
153 2 2 True
154 2 2 True
155 2 2 True
156 2 2 True
157 1 1 True
158 2 2 True
159 2 2 True
160 2 2 True
161 2 2 True
162 2 2 True
163 2 2 True
164 1 1 True
165 1 1 True
166 2 2 True
167 2 2 True
168 1 1 True
169 1 1 True
170 1 1 True
171 2 2 True
172 2 2 True
173 2 2 True
174 2 2 True
175 2 2 True
176 2 2 True
177 2 2 True
178 2 2 True
179 2 2 True
180 1 1 True
181 2 2 True
182 2 2 True
183 1 1 True
184 2 2 True
185 2 2 True
186 2 2 True
187 1 1 True
188 2 2 True
189 2 2 True
190 2 2 True
191 2 2 True
192 2 2 True
193 2 2 True
194 2 2 True
195 1 1 True
196 2 2 True
197 2 2 True
198 3 3 True
199 2 2 True
200 1 1 True
201 2 2 True
202 2 2 True
203 2 2 True
204 2 2 True
205 2 2 True
206 2 2 True
207 2 2 True
208 2 2 True
209 2 2 True
210 2 2 True
211 2 2 True
212 2 2 True
213 2 2 True
214 2 2 True
215 2 2 True
216 2 2 True
217 2 2 True
218 2 2 True
219 2 2 True
220 2 2 True
221 2 2 True
222 2 2 True
223 2 2 True
224 2 2 True
225 2 2 True
226 2 2 True
227 2 2 True
228 2 2 True
229 2 2 True
230 2 2 True
231 2 2 True
232 2 2 True
233 1 1 True
234 2 2 True
235 2 2 True
236 2 2 True
237 2 2 True
238 2 2 True
239 2 2 True
240 1 1 True
241 2 2 True
242 2 2 True
243 2 2 True
244 2 2 True
245 1 1 True
246 2 2 True
247 1 1 True
248 2 2 True
249 2 2 True
250 2 2 True
251 2 2 True
252 2 2 True
253 1 1 True
254 2 2 True
255 2 2 True
256 2 2 True
257 2 2 True
258 2 2 True
259 2 2 True
260 1 1 True
261 2 2 True
262 2 2 True
263 2 2 True
264 2 2 True
265 2 2 True
266 2 2 True
267 2 2 True
268 2 2 True
269 2 2 True
270 2 2 True
271 1 1 True
272 2 2 True
273 1 1 True
274 1 1 True
275 2 2 True
276 2 2 True
277 1 1 True
278 2 2 True
279 2 2 True
280 1 1 True
281 2 2 True
282 1 1 True
283 2 2 True
284 2 2 True
285 2 2 True
286 2 2 True
287 1 1 True
288 2 2 True
289 2 2 True
290 2 2 True
291 2 2 True
292 1 1 True
293 2 2 True
294 2 2 True
295 2 2 True
296 2 2 True
297 2 2 True
298 2 2 True
299 1 1 True
300 2 2 True
301 2 2 True
302 2 2 True
303 2 2 True
304 2 2 True
305 2 2 True
306 2 2 True
307 1 1 True
308 2 2 True
309 2 2 True
310 2 2 True
311 2 2 True
312 2 2 True
313 2 2 True
314 2 2 True
315 2 2 True
316 2 2 True
317 1 1 True
318 2 2 True
319 2 2 True
320 2 2 True
321 2 2 True
322 2 2 True
323 2 2 True
324 2 2 True
325 2 2 True
326 2 2 True
327 1 1 True
328 2 2 True
329 2 2 True
330 2 2 True
331 2 2 True
332 2 2 True
333 2 2 True
334 2 2 True
335 2 2 True
336 1 1 True
337 1 1 True
338 1 1 True
339 2 2 True
340 2 2 True
341 2 2 True
342 2 2 True
343 2 2 True
344 2 2 True
345 2 2 True
346 2 2 True
347 1 1 True
348 1 1 True
349 2 2 True
350 2 2 True
351 1 1 True
352 2 2 True
353 2 2 True
354 1 1 True
355 1 1 True
356 1 1 True
357 2 2 True
358 2 2 True
359 2 2 True
360 2 2 True
361 2 2 True
362 2 2 True
363 2 2 True
364 1 1 True
365 1 1 True
366 2 2 True
367 2 2 True
368 2 2 True
369 2 2 True
370 1 1 True
371 1 1 True
372 2 2 True
373 2 2 True
374 2 2 True
375 1 1 True
376 1 1 True
377 1 1 True
378 1 1 True
379 2 2 True
380 2 2 True
381 2 2 True
382 2 2 True
383 2 2 True
384 1 1 True
385 1 1 True
386 2 2 True
387 1 1 True
388 1 1 True
389 2 2 True
390 2 2 True
391 1 1 True
392 1 1 True
393 1 1 True
394 2 2 True
395 2 2 True
396 2 2 True
397 2 2 True
398 2 2 True
399 2 2 True
400 1 1 True
401 2 2 True
402 2 2 True
403 2 2 True
404 2 2 True
405 2 2 True
406 2 2 True
407 2 2 True
408 2 2 True
409 2 2 True
410 2 2 True
411 2 2 True
412 1 1 True
413 2 2 True
414 2 2 True
415 1 1 True
416 2 2 True
417 2 2 True
418 2 2 True
419 2 2 True
420 1 1 True
421 2 2 True
422 2 2 True
423 1 1 True
424 1 1 True
425 2 2 True
426 2 2 True
427 2 2 True
428 1 1 True
429 2 2 True
430 1 1 True
431 2 2 True
432 2 2 True
433 1 1 True
434 2 2 True
435 2 2 True
436 2 2 True
437 1 1 True
438 2 2 True
439 1 1 True
440 2 2 True
441 2 2 True
442 2 2 True
443 2 2 True
444 2 2 True
445 2 2 True
446 2 2 True
447 1 1 True
448 1 1 True
449 2 2 True
450 2 2 True
451 1 1 True
452 2 2 True
453 1 1 True
454 2 2 True
455 2 2 True
456 2 2 True
457 2 2 True
458 2 2 True
459 2 2 True
460 2 2 True
461 1 1 True
462 2 2 True
463 2 2 True
464 1 1 True
465 1 1 True
466 1 1 True
467 2 2 True
468 2 2 True
469 2 2 True
470 1 1 True
471 1 1 True
472 2 2 True
473 2 2 True
474 2 2 True
475 2 2 True
476 1 1 True
477 1 1 True
478 2 2 True
479 1 1 True
480 2 2 True
481 2 2 True
482 2 2 True
483 1 1 True
484 2 2 True
485 2 2 True
486 2 2 True
487 1 1 True
488 2 2 True
489 2 2 True
490 1 1 True
491 2 2 True
492 2 2 True
493 2 2 True
494 2 2 True
495 2 2 True
496 2 2 True
497 2 2 True
498 2 2 True
499 2 2 True
500 1 1 True
501 2 2 True
502 2 2 True
503 2 2 True
504 2 2 True
505 2 2 True
506 2 2 True
507 2 2 True
508 2 2 True
509 2 2 True
510 1 1 True
511 2 2 True
512 1 1 True
513 2 2 True
514 1 1 True
515 1 1 True
516 2 2 True
517 1 1 True
518 2 2 True
519 2 2 True
520 1 1 True
521 1 1 True
522 1 1 True
523 1 1 True
524 2 2 True
525 2 2 True
526 1 1 True
527 1 1 True
528 2 2 True
529 2 2 True
530 2 2 True
531 2 2 True
532 1 1 True
533 1 1 True
534 1 1 True
535 2 2 True
536 2 2 True
537 1 1 True
538 1 1 True
539 1 1 True
540 2 2 True
541 2 2 True
542 2 2 True
543 2 2 True
544 2 2 True
545 2 2 True
546 1 1 True
547 2 2 True
548 2 2 True
549 1 1 True
550 1 1 True
551 2 2 True
552 1 1 True
553 2 2 True
554 2 2 True
555 2 2 True
556 1 1 True
557 1 1 True
558 1 1 True
559 1 1 True
560 2 2 True
561 2 2 True
562 2 2 True
563 2 2 True
564 2 2 True
565 2 2 True
566 2 2 True
567 2 2 True
568 1 1 True
569 2 2 True
570 2 2 True
571 2 2 True
572 2 2 True
573 2 2 True
574 2 2 True
575 1 1 True
576 2 2 True
577 2 2 True
578 1 1 True
579 2 2 True
580 1 1 True
581 2 2 True
582 2 2 True
583 1 1 True
584 1 1 True
585 2 2 True
586 2 2 True
587 2 2 True
588 2 2 True
589 2 2 True
590 2 2 True
591 2 2 True
592 2 2 True
593 1 1 True
594 2 2 True
595 2 1 False
596 2 2 True
597 2 2 True
598 1 1 True
599 1 1 True
600 2 2 True
601 2 2 True
602 2 2 True
603 2 2 True
604 2 2 True
605 2 2 True
606 1 1 True
607 1 1 True
608 2 2 True
609 2 2 True
610 2 2 True
611 2 2 True
612 2 2 True
613 2 1 False
614 2 2 True
615 1 1 True
616 2 2 True
617 2 2 True
618 2 2 True
619 2 2 True
620 2 2 True
621 2 2 True
622 2 2 True
623 1 1 True
624 2 2 True
625 2 2 True
626 1 1 True
627 2 2 True
628 2 2 True
629 2 2 True
630 2 2 True
631 2 2 True
632 2 2 True
633 2 2 True
634 2 2 True
635 2 2 True
636 1 1 True
637 1 2 False
638 2 2 True
639 2 2 True
640 2 2 True
641 1 1 True
642 2 2 True
643 2 2 True
644 1 1 True
645 2 2 True
646 2 1 False
647 2 2 True
648 2 2 True
649 1 1 True
650 2 2 True
651 1 1 True
652 2 2 True
653 2 2 True
654 1 1 True
655 1 1 True
656 2 2 True
657 1 1 True
658 1 1 True
659 2 2 True
660 1 1 True
661 1 1 True
662 2 2 True
663 1 1 True
664 2 2 True
665 2 2 True
666 2 2 True
667 2 2 True
668 1 1 True
669 2 2 True
670 2 2 True
671 1 1 True
672 2 2 True
673 2 2 True
674 2 2 True
675 2 2 True
676 2 2 True
677 2 2 True
678 2 2 True
679 2 2 True
680 1 1 True
681 2 2 True
682 2 2 True
683 1 1 True
684 2 2 True
685 1 1 True
686 2 2 True
687 2 2 True
688 2 2 True
689 1 1 True
690 1 1 True
691 2 2 True
692 2 2 True
693 2 2 True
694 1 1 True
695 2 2 True
696 2 2 True
697 2 2 True
698 2 2 True
699 2 2 True
700 2 2 True
701 2 2 True
702 2 2 True
703 2 2 True
704 2 2 True
705 2 2 True
706 2 2 True
707 2 2 True
708 2 2 True
709 2 2 True
710 2 2 True
711 2 2 True
712 2 2 True
713 2 2 True
714 2 2 True
715 2 2 True
716 1 1 True
717 2 2 True
718 2 2 True
719 2 2 True
720 2 2 True
721 1 1 True
722 2 2 True
723 2 2 True
724 2 2 True
725 2 2 True
726 2 2 True
727 1 1 True
728 2 2 True
729 2 2 True
730 1 2 False
731 2 2 True
732 2 2 True
733 2 2 True
734 2 2 True
735 1 1 True
736 1 1 True
737 2 2 True
738 2 2 True
739 2 2 True
740 2 2 True
741 2 2 True
742 2 2 True
743 2 2 True
744 2 2 True
745 2 2 True
746 2 2 True
747 1 1 True
748 1 1 True
749 2 2 True
750 2 2 True
751 2 2 True
752 2 2 True
753 2 2 True
754 2 2 True
755 2 2 True
756 2 2 True
757 2 2 True
758 2 2 True
759 1 1 True
760 2 2 True
761 2 2 True
762 2 2 True
763 2 2 True
764 2 2 True
765 2 2 True
766 2 2 True
767 2 2 True
768 2 2 True
769 2 2 True
770 2 2 True
771 2 2 True
772 1 1 True
773 2 2 True
774 2 2 True
775 2 2 True
776 2 2 True
777 2 2 True
778 2 2 True
779 2 2 True
780 2 2 True
781 2 2 True
782 2 2 True
783 2 2 True
784 2 2 True
785 2 2 True
786 2 2 True
787 2 2 True
788 2 2 True
789 2 2 True
790 2 2 True
791 2 2 True
792 1 1 True
793 2 2 True
794 2 2 True
795 2 2 True
796 2 2 True
797 2 2 True
798 2 2 True
799 1 1 True
800 2 2 True
801 2 2 True
802 2 2 True
803 2 2 True
804 2 2 True
805 2 2 True
806 2 2 True
807 2 2 True
808 1 1 True
809 2 2 True
810 1 1 True
811 2 2 True
812 2 2 True

Overall the result looks good to me. Do you have more details for further debugging?

Darius

from sklearn-porter.

Gizmomens avatar Gizmomens commented on June 9, 2024

This can't be right. The Java code also returns the classes "2" and "1" most of the time, just as shown above, but the data set is almost equally distributed in terms of class labels! And in scikit-learn I get the right labels about 70% of the time. In the integrity check above, classes 0 and 3 are never predicted, and it happens exactly like that in Eclipse, but it shouldn't be that way. The Python model returns all the classes!

Just take a look at the "Training.csv" file attached in the post above; scroll all the way to the right and you'll see all 4 labels are present in almost equal amounts.

from sklearn-porter.

nok avatar nok commented on June 9, 2024

Yes, I understand, but the quality of your original trained estimator is weak.

Small data analysis:

import pandas as pd

df = pd.read_csv('training.csv', sep=',')
print(pd.value_counts(df['direction'].values, sort=False))
# Left     204
# Below    214
# Right    191
# Above    204

Drop the last column to get just the features:

x = df.drop(labels='direction', axis=1).as_matrix()  # .values in newer pandas versions

Load the originally provided model:

from sklearn.externals import joblib  # plain `import joblib` in newer scikit-learn versions

clf = joblib.load('model.pkl')
y_original = clf.predict(x)

All the predictions:

['Left' 'Below' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left'
 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Below' 'Below' 'Left' 'Left'
 'Below' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Below' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Below' 'Left' 'Left' 'Left' 'Left' 'Above' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Below' 'Left' 'Above' 'Left' 'Below' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Below' 'Left' 'Left' 'Below' 'Left' 'Above' 'Below' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Below' 'Below' 'Left' 'Left' 'Below' 'Below'
 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Below' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Right' 'Left'
 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Below' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Below' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Below' 'Left' 'Below' 'Below' 'Left' 'Left' 'Below' 'Left' 'Left'
 'Below' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left'
 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Below' 'Below' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Below' 'Left'
 'Left' 'Below' 'Left' 'Left' 'Below' 'Below' 'Below' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Below' 'Below' 'Left' 'Left' 'Left' 'Left'
 'Below' 'Below' 'Left' 'Left' 'Left' 'Below' 'Below' 'Below' 'Below'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Below' 'Left' 'Below' 'Below'
 'Left' 'Left' 'Below' 'Below' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left'
 'Left' 'Below' 'Left' 'Left' 'Below' 'Below' 'Left' 'Left' 'Left' 'Below'
 'Left' 'Below' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Below' 'Left'
 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Below'
 'Left' 'Left' 'Below' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Below' 'Left' 'Left' 'Below' 'Below' 'Below' 'Left' 'Left'
 'Left' 'Below' 'Below' 'Left' 'Left' 'Left' 'Left' 'Below' 'Below' 'Left'
 'Below' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Below' 'Left'
 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Below' 'Left' 'Below' 'Left' 'Below' 'Below' 'Left' 'Below'
 'Left' 'Left' 'Below' 'Below' 'Below' 'Below' 'Left' 'Left' 'Below'
 'Below' 'Left' 'Left' 'Left' 'Left' 'Below' 'Below' 'Below' 'Left' 'Left'
 'Below' 'Below' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below'
 'Left' 'Left' 'Below' 'Below' 'Left' 'Below' 'Left' 'Left' 'Left' 'Below'
 'Below' 'Below' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left'
 'Left' 'Below' 'Left' 'Below' 'Left' 'Left' 'Below' 'Below' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left'
 'Left' 'Below' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below'
 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Below'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below'
 'Below' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Below' 'Left' 'Left'
 'Left' 'Left' 'Below' 'Left' 'Below' 'Left' 'Left' 'Below' 'Below' 'Left'
 'Below' 'Below' 'Left' 'Below' 'Below' 'Left' 'Below' 'Left' 'Left'
 'Left' 'Left' 'Below' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Below' 'Left' 'Below'
 'Left' 'Left' 'Left' 'Below' 'Below' 'Left' 'Left' 'Left' 'Below' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Below' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Below' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Below'
 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Below' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Left' 'Below' 'Left' 'Left' 'Left' 'Left' 'Left' 'Left'
 'Left' 'Left' 'Below' 'Left' 'Below' 'Left' 'Left']

Analyse the original predictions:

y_df = pd.DataFrame(y_original, columns=['direction'])
print(y_df.groupby(['direction']).size())
# Above      3
# Below    175
# Left     634
# Right      1

Finally, the transpiled version matches the original predictions to 99.38 percent (see my previous comment).
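For reference, the integrity score is just the fraction of samples on which the original estimator and the transpiled port agree; the table above has 5 mismatches out of 813 samples, and 808/813 is exactly the reported 0.9938... A minimal sketch (the function name is illustrative, not sklearn-porter's internal API):

```python
def integrity_score(y_original, y_transpiled):
    """Fraction of samples where both implementations predict the same class."""
    matches = sum(a == b for a, b in zip(y_original, y_transpiled))
    return matches / float(len(y_original))

# 808 of the 813 predictions in the table above agree:
assert abs(808 / 813.0 - 0.993849938499385) < 1e-12

# Tiny illustration: one mismatch in four samples.
print(integrity_score([2, 1, 2, 2], [2, 1, 2, 1]))  # 0.75
```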

Darius

from sklearn-porter.

Gizmomens avatar Gizmomens commented on June 9, 2024

I see, but what I don't understand is: if the estimator is that weak, how was scikit-learn giving me such a good result?

RandomForestClassifier(bootstrap=True, class_weight=None, criterion='gini',
            max_depth=None, max_features='auto', max_leaf_nodes=None,
            min_impurity_decrease=0.0, min_impurity_split=None,
            min_samples_leaf=1, min_samples_split=2,
            min_weight_fraction_leaf=0.0, n_estimators=50, n_jobs=1,
            oob_score=False, random_state=0, verbose=0, warm_start=False)
Actual outcome :: Left and Predicted outcome :: Left
Actual outcome :: Above and Predicted outcome :: Above
Actual outcome :: Above and Predicted outcome :: Above
Actual outcome :: Above and Predicted outcome :: Above
Actual outcome :: Above and Predicted outcome :: Above
Actual outcome :: Above and Predicted outcome :: Right
Actual outcome :: Above and Predicted outcome :: Right
Actual outcome :: Above and Predicted outcome :: Right
Actual outcome :: Above and Predicted outcome :: Left
Actual outcome :: Above and Predicted outcome :: Above
Actual outcome :: Above and Predicted outcome :: Above
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Right and Predicted outcome :: Left
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Below and Predicted outcome :: Below
Actual outcome :: Below and Predicted outcome :: Right
Actual outcome :: Below and Predicted outcome :: Right
Actual outcome :: Below and Predicted outcome :: Right
Actual outcome :: Below and Predicted outcome :: Right
Actual outcome :: Below and Predicted outcome :: Below
Actual outcome :: Below and Predicted outcome :: Below
Actual outcome :: Below and Predicted outcome :: Right
Actual outcome :: Below and Predicted outcome :: Right
Actual outcome :: Below and Predicted outcome :: Below
Actual outcome :: Below and Predicted outcome :: Below
Actual outcome :: Left and Predicted outcome :: Left
Actual outcome :: Left and Predicted outcome :: Left
Actual outcome :: Left and Predicted outcome :: Left
Actual outcome :: Left and Predicted outcome :: Below
Actual outcome :: Left and Predicted outcome :: Left
Actual outcome :: Left and Predicted outcome :: Below
Actual outcome :: Left and Predicted outcome :: Right
Actual outcome :: Left and Predicted outcome :: Right
Actual outcome :: Left and Predicted outcome :: Right
Actual outcome :: Left and Predicted outcome :: Left
Actual outcome :: Above and Predicted outcome :: Above
Actual outcome :: Above and Predicted outcome :: Below
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Below and Predicted outcome :: Above
Actual outcome :: Left and Predicted outcome :: Right
Actual outcome :: Above and Predicted outcome :: Above
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Below and Predicted outcome :: Right
Actual outcome :: Left and Predicted outcome :: Left
Actual outcome :: Above and Predicted outcome :: Above
Actual outcome :: Above and Predicted outcome :: Above
Actual outcome :: Above and Predicted outcome :: Above
Actual outcome :: Above and Predicted outcome :: Above
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Right and Predicted outcome :: Right
Actual outcome :: Below and Predicted outcome :: Below
Actual outcome :: Below and Predicted outcome :: Right
Actual outcome :: Below and Predicted outcome :: Below
Actual outcome :: Below and Predicted outcome :: Above
Actual outcome :: Left and Predicted outcome :: Left
Actual outcome :: Left and Predicted outcome :: Left
Actual outcome :: Left and Predicted outcome :: Left
Actual outcome :: Left and Predicted outcome :: Left
Actual outcome :: Left and Predicted outcome :: Right
Train Accuracy ::  1.0
Test Accuracy  ::  0.6617647058823529

This is what I get. It is the exact same model I sent you in pickle format.

from sklearn-porter.

nok avatar nok commented on June 9, 2024

Hello @Gizmomens,

could you resolve that strange behaviour? Which package versions did you use? Please post the output of conda env export and pip list. I found commit #31a4691, which handles different versions of pickle (source).
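Beyond pip list, it can help to print the relevant versions directly from the interpreter that trained and the one that loads the model, since pickled estimators are sensitive to scikit-learn and pickle protocol mismatches. A small sketch:

```python
import pickle
import sys

import numpy
import sklearn

# Versions that commonly cause a pickled estimator to behave differently
# on another machine.
print('Python          :', sys.version.split()[0])
print('pickle protocol :', pickle.HIGHEST_PROTOCOL)
print('numpy           :', numpy.__version__)
print('scikit-learn    :', sklearn.__version__)
```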

Darius

from sklearn-porter.

Gizmomens avatar Gizmomens commented on June 9, 2024

C:\Python\Scripts>pip list
beautifulsoup4 (4.6.0)
bleach (1.5.0)
enum34 (1.1.6)
html5lib (0.9999999)
Markdown (2.6.11)
numpy (1.14.0)
pip (8.1.1)
protobuf (3.5.1)
scikit-learn (0.19.1)
scipy (1.0.0)
setuptools (20.10.1)
six (1.11.0)
sklearn (0.0)
sklearn-porter (0.6.1)
tensorflow (1.4.0)
tensorflow-tensorboard (0.4.0)
Werkzeug (0.14.1)
wheel (0.30.0)

from sklearn-porter.

nok avatar nok commented on June 9, 2024

Could you figure out the mismatch? Did you dump the model with the parameter compress=0?
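For context, compress is a parameter of joblib.dump. A quick round-trip check (sketched on the iris data, not your model) verifies that a dumped and reloaded estimator reproduces its own predictions:

```python
import joblib  # bundled as sklearn.externals.joblib in scikit-learn 0.19.x
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
clf = RandomForestClassifier(n_estimators=10, random_state=0).fit(X, y)

# compress=0 writes the file uncompressed; values 1-9 trade size for speed.
joblib.dump(clf, 'model.pkl', compress=0)
restored = joblib.load('model.pkl')

# The reloaded model must reproduce the original predictions exactly.
assert (restored.predict(X) == clf.predict(X)).all()
```

If this assertion holds on the machine that loads the model, the pickle itself is not the source of the mismatch.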

from sklearn-porter.
