Comments (10)
Hello @beshoo, could you please try this:
cd fasttext.js/examples
node train
node server
and then point your browser to http://localhost:3000/?text=beshoo
With the example model and dataset you should always get a response like this:
$ curl "http://localhost:3000/?text=beshoo" | json_pp
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 174 100 174 0 0 27835 0 --:--:-- --:--:-- --:--:-- 29000
{
"response_time" : 0.001,
"predict" : [
{
"score" : "0.5",
"label" : "BAND"
},
{
"label" : "ORGANIZATION",
"score" : "0.498047"
}
]
}
as well as
wget -qO- http://localhost:3000/?text=beshoo
{
"response_time": 0.001,
"predict": [
{
"label": "BAND",
"score": "0.5"
},
{
"label": "ORGANIZATION",
"score": "0.498047"
}
]
}
from fasttext.js.
Well, the problem is I don't have the labeled text data to retrain it.
But here is the bin model; it is a gender classification model:
http://beshoo.com/gender.bin.gz
Please try it...
root@server [/home/mybeshoo/www]# wget -qO- http://localhost:3300/?text=beshoo
{
"response_time": 4.909,
"predict": [
{
"label": "FEMALE",
"score": "0.998047"
},
{
"label": "MALE",
"score": "1.95313E-08"
}
]
}
Now let's try again:
root@server [/home/mybeshoo/www]# wget -qO- http://localhost:3300/?text=beshoo
{
"response_time": 1.156,
"predict": [
{
"label": "MALE",
"score": "0.794922"
},
{
"label": "FEMALE",
"score": "0.203125"
}
]
}
@beshoo that sounds weird! Thanks, I will take a look at the generated model. In the meantime I have tried a Facebook pre-trained languages model and it seems okay:
cd examples/
export MODEL=data/lid.176.ftz
http://localhost:3000/?text=das%20is%20schon
{
"predict" : [
{
"score" : "0.745016",
"label" : "DE"
},
{
"score" : "0.232697",
"label" : "EN"
}
],
"response_time" : 0
}
both the quantized model and the uncompressed one:
[loretoparisi@:mbploreto examples]$ ls -lh data/lid.176.ftz
-rw-r--r--@ 1 loretoparisi staff 916K 19 Ott 23:50 data/lid.176.ftz
[loretoparisi@:mbploreto examples]$ ls -lh /root/lid176_model.bin
-rw-r--r-- 1 loretoparisi staff 125M 10 Ott 16:25 /root/lid176_model.bin
Going to check yours then. Are you running on Mac/Linux/Windows?
Also, your two runs show very different response times: "response_time": 4.909 vs "response_time": 1.156. That seems strange to me.
I am on Linux 😎
@beshoo ok, thanks; going to test again on both macOS and Linux then.
@beshoo so I did the following:
Created and built a Dockerfile in this repo to check the Linux version. I'm using Ubuntu 16.04 here:
docker build -t fasttext.js .
Tested your model against it:
docker run -v /models/:/models --rm -it -p 3000:3000 -e MODEL=/models/gender.bin fasttext.js node fasttext.js/examples/server.js
[loretoparisi@:mbploreto fasttext.js]$ curl http://localhost:3000/?text=I%20love%cars | json_pp
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 169 100 169 0 0 24257 0 --:--:-- --:--:-- --:--:-- 28166
{
"response_time" : 0,
"predict" : [
{
"label" : "MALE",
"score" : "0.855469"
},
{
"label" : "FEMALE",
"score" : "0.142578"
}
]
}
[loretoparisi@:mbploreto fasttext.js]$ curl http://localhost:3000/?text=I%20love%dressing | json_pp
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 173 100 173 0 0 22718 0 --:--:-- --:--:-- --:--:-- 24714
{
"predict" : [
{
"score" : "0.535156",
"label" : "FEMALE"
},
{
"score" : "0.462891",
"label" : "MALE"
}
],
"response_time" : 0.001
}
and the same text multiple times as well:
[loretoparisi@:mbploreto fasttext.js]$ for ((n=0;n<10;n++)); do curl http://localhost:3000/?text=I%20love%dressing; done
{
"response_time": 0.001,
"predict": [
{
"label": "FEMALE",
"score": "0.535156"
},
{
"label": "MALE",
"score": "0.462891"
}
]
}{
"response_time": 0.001,
"predict": [
{
"label": "FEMALE",
"score": "0.535156"
},
{
"label": "MALE",
"score": "0.462891"
}
]
}{
"response_time": 0.001,
"predict": [
{
"label": "FEMALE",
"score": "0.535156"
},
{
"label": "MALE",
"score": "0.462891"
}
]
}{
"response_time": 0,
"predict": [
{
"label": "FEMALE",
"score": "0.535156"
},
{
"label": "MALE",
"score": "0.462891"
}
]
}{
"response_time": 0,
"predict": [
{
"label": "FEMALE",
"score": "0.535156"
},
{
"label": "MALE",
"score": "0.462891"
}
]
}{
"response_time": 0,
"predict": [
{
"label": "FEMALE",
"score": "0.535156"
},
{
"label": "MALE",
"score": "0.462891"
}
]
}{
"response_time": 0,
"predict": [
{
"label": "FEMALE",
"score": "0.535156"
},
{
"label": "MALE",
"score": "0.462891"
}
]
}{
"response_time": 0.001,
"predict": [
{
"label": "FEMALE",
"score": "0.535156"
},
{
"label": "MALE",
"score": "0.462891"
}
]
}{
"response_time": 0.001,
"predict": [
{
"label": "FEMALE",
"score": "0.535156"
},
{
"label": "MALE",
"score": "0.462891"
}
]
}{
"response_time": 0.001,
"predict": [
{
"label": "FEMALE",
"score": "0.535156"
},
{
"label": "MALE",
"score": "0.462891"
}
]
}
and in your example:
[loretoparisi@:mbploreto fasttext.js]$ for ((n=0;n<5;n++)); do curl http://localhost:3000/?text=beshoo; done
{
"response_time": 0.001,
"predict": [
{
"label": "FEMALE",
"score": "0.736328"
},
{
"label": "MALE",
"score": "0.261719"
}
]
}{
"response_time": 0,
"predict": [
{
"label": "FEMALE",
"score": "0.736328"
},
{
"label": "MALE",
"score": "0.261719"
}
]
}{
"response_time": 0,
"predict": [
{
"label": "FEMALE",
"score": "0.736328"
},
{
"label": "MALE",
"score": "0.261719"
}
]
}{
"response_time": 0,
"predict": [
{
"label": "FEMALE",
"score": "0.736328"
},
{
"label": "MALE",
"score": "0.261719"
}
]
}{
"response_time": 0.001,
"predict": [
{
"label": "FEMALE",
"score": "0.736328"
},
{
"label": "MALE",
"score": "0.261719"
}
]
}
Everything seems to work ok. Are you running any proxy in front of your web server listening on 3300? Also, which version of Linux are you running?
No proxy at all.
Linux Server release 6.9.
Btw, it is something that comes and goes. I am testing on my server now; it works and the same result comes back, but I noticed something:
when the service returns multiple labels (I mean the error scenario), pressing enter takes about 3 seconds to return the output, even though the service is online. Do you think it's some kind of DDoS? Not sure...
But believe me, it's happening...
Now when I hit enter, the output returns in less than a second.
I tried to benchmark the service via ab, but it returns the correct result.
I am not sure.
I'm closing this issue, feel free to reopen it if you have additional questions or further issues.