Comments (22)
also re: StateMap - yes I will finalize that object as soon as we are done with UI updates, and then you can point to specific steps to bind to specific bots.
from bot_or_not.
I have tried replicating this but can't :/
UPDATE: I see the issue, but it's not a problem of clearing the bot queue - it is reset on every round.
hmmm. ok I'll do some detective work and get back to you
looking into it - for the moment I'm wondering if all the `if` statements can be handled more elegantly to avoid time lags?
oh yeah for sure, I'll look at that now
I think the clearing is happening on line 75 of App.js: `this.botQueue = this.botQueue.concat(messages);`
hm, but `concat` doesn't clear anything?
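Indeed, `concat` can't be the culprit here: `Array.prototype.concat` returns a new array and never empties the receiver. A standalone illustration (not the actual App.js code):

```javascript
// concat produces a new array; concatenating an empty `messages`
// array just yields an unchanged copy - nothing gets cleared.
let botQueue = ['msg1', 'msg2'];
const messages = [];                  // empty on every new round
botQueue = botQueue.concat(messages);
console.log(botQueue.length);        // still 2 - the queue was not cleared
```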
`messages` is an empty array on every new round - but yeah, true, it shouldn't clear it
I think it's a timing issue (the bot will chew through the backlog at some point, and as long as the user is seeing the Round [X] screen, they are not typing anything, which gives the bot time to go through the queue).
I will just try adding a timeout to the typing function and not let the user type too fast?
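A timeout on the typing function could be as simple as enforcing a minimum gap between sends - a rough sketch with made-up names (`makeThrottle` and `minGapMs` are not from the codebase):

```javascript
// Hypothetical sketch: reject a send if it arrives too soon after the
// previous one, giving the backend time to drain the bot queue.
function makeThrottle(minGapMs) {
  let lastSent = -Infinity;
  return function canSend(nowMs) {
    if (nowMs - lastSent >= minGapMs) {
      lastSent = nowMs;
      return true;   // allowed: record this send time
    }
    return false;    // too soon: caller should block or buffer the input
  };
}
```

Usage would be something like calling `canSend(Date.now())` in the submit handler and ignoring the keystroke when it returns false.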
is there not another way to clear it? the user typing timeout would have to be pretty long to prevent this from happening
I will look into both, but I think it's more important to not put the bot in that position to begin with, rather than clear the queue.
ya that's true, maybe a combination would be good.
can you explain what's happening here?
yes: I think I should move this into text processing, though.
it sends this to the text-processing middleware and checks whether the parser returns a reply (e.g., a bot response that doesn't need to be sent to Dialogflow). If there's nothing back from the preprocessor, then the sample is sent to Dialogflow.
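The flow described above is roughly this sketch, where `preprocess` and `sendToDialogflow` are illustrative stand-ins for the real middleware calls:

```javascript
// Sketch of the described pipeline: the preprocessor gets first crack at the
// message; only when it returns nothing does the sample go on to Dialogflow.
async function handleMessage(text, state, preprocess, sendToDialogflow) {
  const reply = preprocess(text, state); // may return a canned bot response
  if (reply) {
    return reply;                        // short-circuit: no Dialogflow call
  }
  return sendToDialogflow(text);         // fall through to the NLU backend
}
```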
the context line is a nasty hack (it's actually a bit less nasty in the line i just pushed); at some point we should talk about what your philosophy with the 'state' object is. basically, for now it tells the preprocessor whether to activate certain checks or not.
is it possible the callbacks introduce some confusion/time lag? looking at the `await`s, mostly
where are you noticing the lag? I'll try adding some timers
I think what happens when I type fast is that the UI is displaying my messages, but it doesn't reach the bot as fast as it gets displayed - hence, the bot queue growing. Once I leave some time for the backend to catch up, then that queue is being reduced. That's why I thought that callbacks and `await`s could be behind this.
Basically: I'm not sure what regulates how quickly my text (from the UI) reaches the bot, how long it spends there, and how it comes back to the frontend. I thought that the lines I posted above are doing that?
I think taking a look at what is happening in Network will be useful - not sure what the correct order should be?
ok i think i know what's going on:
when preprocessor runs in non-truth-challenge mode, it takes around 3ms to run: so that’s not the issue
when it parses and returns something without setting context, around 10ms. this is also pretty speedy -- and remember, after that there's no call to the bot (making this the quickest response in terms of total time)
the thing that takes the time is when it sets a context for the bot. this is a callback that takes up to 500ms to return (average about 100-150ms), so by the time it has returned and a sample has been sent to the bot, it can take a long time.
unfortunately, the current way this works means that a context gets set, then the bot handles the callback. so we couldn't just set the context and then run the response.
BUT — what we could do is move those decisions off the bot entirely. if we’re setting the context in response to user input anyway, we probably know what the intent is: we’re actually using this as a proxy. so, instead of doing that, we can just randomise a response from the bot (even set a context based on response instead!!), which should speed stuff up massively.
the reason you were seeing such a big slowdown, that I hadn’t really noticed when I was debugging the bot queue, is that the only bot context where this currently happens (truth-challenge-bot) was set to the default bot in stateHelpers. In particular, there's a context that gets set when you don't ask the bot a question in the initial interaction. This is now changed.
tl;dr - currently there are some requests with 2 callbacks to the bot: one to set context, and one to get a response. this can be gotten rid of! things will get faster!
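In other words, the proposal is to replace the set-context round trip with a purely local decision - something like this sketch (names and shape are hypothetical, not the pushed code):

```javascript
// Hypothetical sketch of the fix: once the user's input tells us the intent,
// pick a randomised canned response locally and record the context for the
// next turn, instead of making a setContext callback to the bot followed by
// a second callback to fetch the bot's response.
function respondLocally(intent, responsesByIntent, state) {
  const options = responsesByIntent[intent] || [];
  const reply = options[Math.floor(Math.random() * options.length)];
  state.context = intent;  // context set from the response, no bot round trip
  return reply;
}
```

This turns what was two sequential callbacks (up to ~500ms each for the context one) into zero network calls for those requests.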
Related Issues (20)
- since there is an opponent name, make the bot use that if asked "what is your name"?
- last screen - the header shouldn't have a countdown
- advance the dialogue format from single <--> single text to multiple <--> single or multiple <--> multiple
- message queue not being cleared properly
- No 'wait' screen before final truth round
- player name gets set to whatever the last message they sent was
- hook Intro bot into Free Chat component
- break down Intro flow into more screens
- little pauses before bot reveal, and before player is matched
- (potential) ability to make rounds longer than 1min
- better 'filler' words
- better context buffer
- timer not clearing properly before start of next round
- ...some screens get skipped mysteriously
- move nlp to its own file
- add 'data disclosure' screen at the start
- add link to 'about' at the end of the interaction
- 'typing' dots being displaced inwards
- remove "first up: Truth" screen?
- Add blacklisted words