Comments (3)
Hi,
when you measured the accuracy of the eye-eyebrows (EE22) model, did you run the following code from the training_script?
...
train_xml = "eye_eyebrows_22_train.xml"
test_xml = "eye_eyebrows_22_test.xml"
dat = "eye_eyebrows_22.dat"
slice_xml(train_labels, train_xml, parts)
slice_xml(test_labels, test_xml, parts)
# training
...
# compute training and test error
measure_model_error(dat, train_xml)
measure_model_error(dat, test_xml)
because when I measured the test error I got 8.40, as depicted here.
Also, I've checked the predictions, and their quality is quite good. Keep in mind that the models I've made are built to be fast during inference and lightweight in size, so I had to trade accuracy for speed and size.
Anyway, getting better accuracy is a matter of changing the training parameters.
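For context, the error dlib reports here is the mean distance between predicted and ground-truth landmark positions. A minimal sketch of that computation (a hypothetical helper in pure Python, not dlib's actual implementation, which also normalizes by inter-ocular distance):

```python
import math

def mean_landmark_error(predicted, ground_truth):
    """Mean Euclidean distance between corresponding (x, y) landmarks."""
    assert len(predicted) == len(ground_truth)
    total = sum(math.dist(p, g) for p, g in zip(predicted, ground_truth))
    return total / len(predicted)

# Example: three points, each prediction off by a 3-4-5 pixel offset
pred = [(13.0, 24.0), (33.0, 44.0), (53.0, 64.0)]
true = [(10.0, 20.0), (30.0, 40.0), (50.0, 60.0)]
print(mean_landmark_error(pred, true))  # → 5.0
```

A smaller number therefore means predictions land closer to the annotated points, which is why a value around 8 on a fast, minified model is a reasonable trade-off.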
from dlib-minified-models.
Thank you! I followed the scripts from the Medium page, but I had only run the part that creates the slice_xml training file, so the comparison was against the complete landmark set. I fixed this and now get predictions with an error between 8 and 11.
Now I have another little problem. I train and test my predictors on my desktop computer and everything works fine, but when I move the files to my notebook or a Jetson, they seem to get corrupted. I've tried compressing them and using different storage devices, but loading them always produces:
Traceback (most recent call last):
  File "video_facial_landmarks.py", line 30, in <module>
    predictor = dlib.shape_predictor(args["shape_predictor"])
RuntimeError: Error deserializing object of type short
   while deserializing a floating point number.
   while deserializing a dlib::matrix
   while deserializing object of type std::vector
   while deserializing object of type std::vector
   while deserializing object of type std::vector
Thanks a lot for your tutorials, best regards.
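One quick way to rule out transfer corruption is to compare a checksum of the .dat file on both machines before loading it; if the hashes differ, the file was damaged in transit rather than by dlib. A minimal sketch using Python's hashlib (the file path is a placeholder):

```python
import hashlib

def file_sha256(path, chunk_size=1 << 20):
    """Stream the file in chunks so large model files don't fill RAM."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# Run on both machines and compare the two hex strings:
# print(file_sha256("eye_eyebrows_22.dat"))
```

If the checksums match on both machines and the error still occurs, the problem lies in the dlib installation rather than the file.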
Well, it is a strange issue, because I've built some models (including the one in the repo) on a desktop platform, and when I tested them on an Android device they worked fine.
Maybe the desktop version of Dlib encodes the model in a different format than the notebook/Jetson version. You can check the versions of the libraries, as well as the compilation flags and any missing libraries (though that should not be the case).
If the issue still persists, I think you should contact the Dlib author and possibly open an issue on the Dlib repository.
from dlib-minified-models.