Nuance NLU online project
Cookit is a web platform for Nuance NLU data management, online word segmentation and model validation.
Requirements: Python 2.7 (x86), Django, MySQL.
Create a schema named 'cookit' in MySQL, then run:
python manage.py makemigrations
python manage.py migrate
The data models defined under NLU/models.py
will be synced into the database.
Cookit requires users to log in to access the NLU online part, so an account must be created on the backend:
python manage.py adduser
Cookit supports three kinds of NLU data:
- corpus
- hrl
- pattern
1. Corpus data template
// no header
Ent.Pause 停止 播放
Ent.Continue 恢复 听 歌
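Each corpus line is an intent label followed by pre-segmented words, separated by whitespace. A minimal parsing sketch (the function name is illustrative, not part of Cookit's code):

```python
# -*- coding: utf-8 -*-

def parse_corpus_line(line):
    """Split a corpus line into (intent, word list).

    Format: "<Intent> <word> <word> ..." with no header line.
    """
    parts = line.strip().split()
    return parts[0], parts[1:]

# Example from the template above:
intent, words = parse_corpus_line(u"Ent.Pause 停止 播放")
# intent -> u"Ent.Pause", words -> [u"停止", u"播放"]
```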
2. Hrl data template
// with header
#head;hrl;2.0;utf-8
#ref#speechfile#speaker#gender#reference word sequence#topic#;slot names#;slot values
head
ref#Blu/009s003.pcm#Blu#male#请打开收音机界面#INTENT_Radio_ShowRadio##
ref#Blu/009s034.pcm#Blu#male#有收音机界面吗#INTENT_Radio_ShowRadio##
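An hrl "ref" record is a `#`-separated row whose fields follow the header description above. A sketch of a record parser (field names are taken from the header line; the function itself is illustrative):

```python
# -*- coding: utf-8 -*-

# Field order as declared in the hrl header line.
HRL_FIELDS = ("type", "speechfile", "speaker", "gender",
              "reference", "topic", "slot_names", "slot_values")

def parse_hrl_ref(line):
    """Parse one "ref#..." record into a dict keyed by HRL_FIELDS."""
    values = line.strip().split("#")
    return dict(zip(HRL_FIELDS, values))

# Example from the template above (trailing slot fields are empty):
rec = parse_hrl_ref(u"ref#Blu/009s003.pcm#Blu#male#请打开收音机界面#INTENT_Radio_ShowRadio##")
```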
3. Pattern data template
Apps.CloseApp APP_NM 不想 听 了
Apps.ShowMenu APP_MENU_NM 打开
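A pattern line looks like a corpus line, except that some tokens are slot placeholders (e.g. APP_NM). A sketch that separates them out, assuming placeholders are uppercase identifiers with underscores (a heuristic, not a rule stated by Cookit):

```python
# -*- coding: utf-8 -*-

def parse_pattern_line(line):
    """Split a pattern line into (intent, tokens, slot placeholders).

    Heuristic assumption: slot placeholders are uppercase
    identifiers containing "_", e.g. APP_NM, APP_MENU_NM.
    """
    parts = line.strip().split()
    intent, tokens = parts[0], parts[1:]
    slots = [t for t in tokens if t.isupper() and "_" in t]
    return intent, tokens, slots

# Example from the template above:
intent, tokens, slots = parse_pattern_line(u"Apps.CloseApp APP_NM 不想 听 了")
# slots -> [u"APP_NM"]
```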
Put all your data files under \Cookit\static\data
Data files are categorized by suffix:
- Corpus: *.cop
- Hrl: *.hrl
- Pattern: *.pat
Then run python manage.py syncdata; all the data files will be synced to the database automatically.
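The suffix-based categorization above can be sketched as follows (the helper and its name are illustrative, not Cookit's actual syncdata implementation):

```python
import os

# Mapping from file suffix to Cookit data type, as listed above.
SUFFIX_TO_TYPE = {".cop": "corpus", ".hrl": "hrl", ".pat": "pattern"}

def categorize(filenames):
    """Group data file names by their Cookit data type.

    Files with an unrecognized suffix are ignored.
    """
    groups = {"corpus": [], "hrl": [], "pattern": []}
    for name in filenames:
        ext = os.path.splitext(name)[1].lower()
        if ext in SUFFIX_TO_TYPE:
            groups[SUFFIX_TO_TYPE[ext]].append(name)
    return groups
```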
Cookit web part provides ...
waiting...