Chatbot conversations: a demo application showing how two (or more) chatbots can talk to each other. The chatbots are powered by the logic used to build Eliza, along with an NLP model.
When running this on a local instance, this may be problematic: the build won't pick up changes from the local filesystem, and instead expects the changes to already be present in the repo before they appear in the built image.
Restructure the Dockerfile so that the image is built from the files in the local environment rather than by cloning the repo.
One option is to temporarily copy the files from the parent directory into the local folder next to the Dockerfile, so that they can be packed into the Docker image.
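As a rough sketch of that restructuring (the base image, paths, and entrypoint below are assumptions, since the actual Dockerfile is not shown), the clone step would be replaced with a COPY of the local sources:

```dockerfile
# Before (assumed): the repo was cloned at build time, so local edits were ignored.
# RUN git clone <repo-url> /app

# After: build from the local build context instead.
FROM python:3.11-slim
WORKDIR /app
# Copies the files sitting next to the Dockerfile (e.g. after copying them
# in from the parent directory) into the image.
COPY . /app
RUN pip install -r requirements.txt
CMD ["python", "chatbot.py"]
```

With this layout, `docker build .` picks up whatever is currently on the local filesystem, so changes no longer need to be pushed to the repo first.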
connecting_worlds needs to know the host, port, and API of each chatbot in order to interact with it.
When adding a new chatbot, we currently need to change the connecting_worlds project itself to add the new chatbot's configuration.
Extracting the chatbot configurations to an external resource would make it much easier for contributors to add new chatbots to the project, and would avoid rebuilding the Docker image to make those changes.
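A minimal sketch of such an external configuration, assuming a JSON registry; the field names (`name`, `host`, `port`, `api_path`) are illustrative, not taken from the connecting_worlds code:

```python
import json

# Hypothetical external chatbot registry. In practice this could live in a
# file mounted into the container (or fetched from a URL), so adding a
# chatbot means editing this resource rather than rebuilding the image.
CONFIG = """
[
  {"name": "eliza", "host": "eliza-bot", "port": 5000, "api_path": "/chat"},
  {"name": "parry", "host": "parry-bot", "port": 5001, "api_path": "/chat"}
]
"""

def load_chatbots(raw: str) -> dict:
    """Parse the registry and build a base URL for each chatbot."""
    bots = {}
    for entry in json.loads(raw):
        bots[entry["name"]] = (
            f'http://{entry["host"]}:{entry["port"]}{entry["api_path"]}'
        )
    return bots

if __name__ == "__main__":
    print(load_chatbots(CONFIG))
```

connecting_worlds would then iterate over `load_chatbots(...)` at startup instead of carrying a hard-coded list of bots.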
Note: issue #18 might change the protocols and communication architecture, which would affect this solution.