
Comments (22)

cosmicespresso commented on July 30, 2024

Also, re: StateMap: yes, I will finalize that object as soon as we're done with the UI updates, and then you can point to specific steps to bind to specific bots.
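
(Purely illustrative, since the object isn't finalised yet: one possible shape for a StateMap that binds steps to bots. The step names and most bot ids here are made up; only truth-challenge-bot and the idea of a default bot come up later in this thread.)

```js
// Hypothetical: each game step names the bot that should handle it.
const StateMap = {
  intro:             { bot: 'default-bot' },
  'round-1':         { bot: 'default-bot' },
  'truth-challenge': { bot: 'truth-challenge-bot' },
};

// Binding a step to a bot is then a simple lookup.
function botForStep(step) {
  return (StateMap[step] || StateMap.intro).bot;
}
```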

agnescameron commented on July 30, 2024

How to get the effect:
[screenshot]

Result in the following round:
[screenshot]

cosmicespresso commented on July 30, 2024

I have tried replicating this but can't :/

UPDATE: I see the issue, but it's not a problem of clearing the bot queue; it is reset on every round.

agnescameron commented on July 30, 2024

Hmm, ok. I'll do some detective work and get back to you.

agnescameron commented on July 30, 2024

[screenshot]

Hm, the bot queue is not getting cleared between rounds: here it is still full from the round before. Where is it being cleared at the moment?

cosmicespresso commented on July 30, 2024

Looking into it. For the moment I'm wondering if all the if statements can be handled more elegantly to avoid time lags?

[screenshot]
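
(Not the project's actual code, just a sketch of one way the if cascade could be flattened: put each check in a table and do a single find over it, so adding or removing checks doesn't grow a long if/else chain. The patterns and replies are invented.)

```js
// Each entry pairs a test with the reply it should produce; first match wins.
const checks = [
  { test: (msg) => /^(hi|hello|hey)\b/i.test(msg), reply: () => 'hey!' },
  { test: (msg) => /how are you/i.test(msg),       reply: () => 'pretty good, you?' },
  { test: (msg) => msg.trim().endsWith('?'),       reply: () => "good question, what do you think?" },
];

// Returns a reply string, or null to fall through to the bot.
function preprocess(message) {
  const match = checks.find((check) => check.test(message));
  return match ? match.reply(message) : null;
}
```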

agnescameron commented on July 30, 2024

Oh yeah, for sure, I'll look at that now.

cosmicespresso commented on July 30, 2024

[screenshot]

cosmicespresso commented on July 30, 2024

I think the clearing is happening on line 75 of App.js: `this.botQueue = this.botQueue.concat(messages);`

agnescameron commented on July 30, 2024

Hm, but `concat` doesn't clear anything?

cosmicespresso commented on July 30, 2024

`messages` is an empty array on every new round, but yeah, true, it shouldn't clear it.
I think it's a timing issue (the bot will chew through the backlog at some point, and as long as the user is seeing the Round [X] screen they are not typing anything, which gives the bot time to go through the queue).

I will just try adding a timeout to the typing function and not let the user type too fast?
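
(A minimal sketch of the timeout idea, with an assumed cooldown value and a hypothetical send callback: the user can still type, but messages are handed to the bot at most once per cooldown, which gives the backend time to keep up.)

```js
const TYPING_COOLDOWN_MS = 1000; // assumed value, tune as needed
let lastSentAt = 0;

// send is whatever currently pushes the user's message to the bot (hypothetical).
function throttledSend(message, send) {
  const now = Date.now();
  if (now - lastSentAt < TYPING_COOLDOWN_MS) {
    return false; // too soon: drop (or queue) the message
  }
  lastSentAt = now;
  send(message);
  return true;
}
```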

agnescameron commented on July 30, 2024

Is there not another way to clear it? The user typing timeout would have to be pretty long to prevent this from happening.

cosmicespresso commented on July 30, 2024

I will look into both, but I think it's more important not to put the bot in that position to begin with, rather than to clear the queue.

agnescameron commented on July 30, 2024

Yeah, that's true; maybe a combination would be good.

cosmicespresso commented on July 30, 2024

[screenshot]
Can you explain what's happening here?

agnescameron commented on July 30, 2024

Yes, though I think I should move this into text processing. It sends the message to the text-processing middleware and checks whether the parser returns a reply (e.g., a bot response that doesn't need to be sent to Dialogflow). If nothing comes back from the preprocessor, the sample is sent to Dialogflow.
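
(Roughly the flow described above, sketched with hypothetical names: preprocess stands in for the text-processing middleware, and detectIntent for whatever actually calls Dialogflow.)

```js
// preprocess returns a reply string, or null if the bot should handle it.
function preprocess(message) {
  if (/^(hi|hello|hey)\b/i.test(message)) return 'hey!';
  return null;
}

async function handleUserMessage(message, detectIntent) {
  // 1. Run the text-processing middleware first.
  const reply = preprocess(message);

  // 2. If the parser produced a reply, skip Dialogflow entirely.
  if (reply) return reply;

  // 3. Otherwise send the sample on to Dialogflow (detectIntent is a
  //    hypothetical wrapper around whatever call the app actually makes).
  return detectIntent(message);
}
```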

agnescameron commented on July 30, 2024

The context line is a nasty hack (it's actually a bit less nasty in the line I just pushed); at some point we should talk about what your philosophy with the 'state' object is. Basically, for now it tells the preprocessor whether or not to activate certain checks.
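
(A small sketch of the "state tells the preprocessor which checks to activate" idea; the flag names and replies are invented for illustration.)

```js
// Hypothetical flags on the state object gate which checks run.
function preprocess(message, state) {
  if (state.checkGreetings && /^(hi|hello)\b/i.test(message)) return 'hey!';
  if (state.checkQuestions && message.trim().endsWith('?')) return "good question, what do you think?";
  return null; // nothing matched: fall through to the bot
}

// e.g. during the truth challenge, only the question check is active:
preprocess('is that really true?', { context: 'truth-challenge', checkQuestions: true, checkGreetings: false });
```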

cosmicespresso commented on July 30, 2024

Is it possible the callbacks introduce some confusion/time lag? Mostly looking at the awaits.

agnescameron commented on July 30, 2024

Where are you noticing the lag? I'll try adding some timers.
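
(One simple way to add those timers, just wrapping the suspect calls in console.time / console.timeEnd; preprocess and detectIntent are the same hypothetical stand-ins as above.)

```js
async function timedHandle(message) {
  console.time('preprocess');
  const reply = preprocess(message);            // middleware call
  console.timeEnd('preprocess');                // e.g. "preprocess: 3ms"

  if (reply) return reply;

  console.time('dialogflow');
  const botReply = await detectIntent(message); // Dialogflow round trip
  console.timeEnd('dialogflow');
  return botReply;
}
```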

cosmicespresso commented on July 30, 2024

I think what happens when I type fast is that the UI displays my messages, but they don't reach the bot as fast as they get displayed; hence the bot queue growing. Once I leave some time for the backend to catch up, the queue gets worked down. That's why I thought that callbacks and awaits could be behind this.

Basically: I'm not sure what regulates how quickly my text (from the UI) reaches the bot, how long it spends there, and how it comes back to the frontend. I thought the lines I posted above were doing that?
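
(A rough model of the behaviour described above, with hypothetical function names: the UI appends the message straight away, but the bot works through its queue one message at a time, so fast typing makes the queue grow until the bot catches up.)

```js
// displayMessage and getBotReply are hypothetical stand-ins for the app's
// real UI update and bot round-trip functions.
const botQueue = [];
let botBusy = false;

function onUserMessage(message) {
  displayMessage(message);  // the UI shows the message immediately...
  botQueue.push(message);   // ...but the bot only sees it via this queue
  drainQueue();
}

async function drainQueue() {
  if (botBusy) return;      // already chewing through the backlog
  botBusy = true;
  while (botQueue.length > 0) {
    const next = botQueue.shift();
    const reply = await getBotReply(next); // preprocess + Dialogflow round trip
    displayMessage(reply);
  }
  botBusy = false;
}
```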

cosmicespresso commented on July 30, 2024

I think taking a look at what is happening in the Network tab will be useful. Not sure what the correct order should be?
[screenshot]

agnescameron commented on July 30, 2024

Ok, I think I know what's going on:

When the preprocessor runs in non-truth-challenge mode, it takes around 3ms to run, so that's not the issue.

When it parses and returns something without setting context, it takes around 10ms. This is also pretty speedy, and remember, after that there's no call to the bot (making this the quickest response in terms of total time).

The thing that takes the time is when it sets a context for the bot. This is a callback that takes up to 500ms to return (on average about 100-150ms), so by the time it has returned and a sample has then been sent to the bot, the whole round trip can take a long time.

Unfortunately, the way this currently works, a context gets set and then the bot handles the callback, so we couldn't just set the context and then run the response.

BUT: what we could do is move those decisions off the bot entirely. If we're setting the context in response to user input anyway, we probably know what the intent is; we're actually using this as a proxy. So, instead of doing that, we can just randomise a response from the bot (and even set a context based on the response instead!), which should speed stuff up massively.

The reason you were seeing such a big slowdown, which I hadn't really noticed when I was debugging the bot queue, is that the only bot context where this currently happens (truth-challenge-bot) was set as the default bot in stateHelpers. In particular, there's a context that gets set when you don't ask the bot a question in the initial interaction. This is now changed.

tl;dr: currently there are some requests with two callbacks to the bot, one to set context and one to get a response. This can be gotten rid of, and things will get faster!
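
(A sketch of the "move those decisions off the bot" idea, with made-up function names and replies rather than the project's actual API: pick a canned response locally and set the context in the background, instead of awaiting a context call before asking the bot for a response.)

```js
// Before (serialised): set the context, wait for it, then ask the bot.
//   await setBotContext('truth-challenge');     // up to ~500ms on its own
//   const reply = await getBotResponse(message);

// After: answer locally and fire-and-forget the context update.
const cannedReplies = {
  'truth-challenge': ['hmm, are you sure about that?', 'prove it!', 'sounds fishy to me'],
};

function randomReply(contextName) {
  const pool = cannedReplies[contextName] || ['ok!'];
  return pool[Math.floor(Math.random() * pool.length)];
}

function handleKnownIntent(message, contextName) {
  // We already know the intent from the user input, so reply immediately...
  const reply = randomReply(contextName);

  // ...and set the context in the background; nothing downstream waits on it.
  setBotContext(contextName).catch((err) => console.error('context set failed', err)); // hypothetical async call

  return reply;
}
```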
