Use Embedded Vector Store and an LLM to chat with your Confluence docs
✨ Only needs a Confluence account and an OpenAI account.
Uses (all via LangChain):
- Confluence API
- Hugging Face
- ChromaDB
- OpenAI
- Create a local embedding vector store of the Confluence docs
- Use the embedding vector store to retrieve texts relevant to the question
- Provide the LLM with the retrieved texts plus the question as a prompt to answer the question
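The steps above can be sketched as a toy retrieval loop. The bag-of-words "embedding" below is a stand-in for the real Hugging Face embeddings so the example runs without any API keys; all documents, names, and the scoring are illustrative, not this project's actual code:

```python
import math

def embed(text, vocab):
    # Stand-in embedding: word counts over a fixed vocabulary.
    words = text.lower().split()
    return [words.count(w) for w in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "the deployment pipeline runs on jenkins",
    "vacation policy allows twenty days per year",
]
vocab = sorted({w for d in docs for w in d.lower().split()})

question = "how many vacation days do we get"
# Retrieve the most similar stored text for the question...
scores = [(cosine(embed(question, vocab), embed(d, vocab)), d) for d in docs]
best = max(scores)[1]

# ...and hand it to the LLM together with the question as the prompt.
prompt = f"Context:\n{best}\n\nQuestion: {question}"
print(best)
```

In the real app, the embeddings come from Hugging Face, the store is ChromaDB, and the prompt goes to OpenAI, all wired together by LangChain.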
- Confluence API Key
- OpenAI account and API Key
```shell
cp .env.example .env
```
Set the values (don't worry, this file is in `.gitignore`).
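The actual variable names are listed in `.env.example`; a file along these lines is the general shape (the names below are illustrative placeholders, not necessarily the real keys):

```shell
# Illustrative only -- copy .env.example and use its real variable names
CONFLUENCE_API_KEY=your-confluence-api-key
OPENAI_API_KEY=your-openai-api-key
```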
```shell
python3 -m venv pyenv
source pyenv/bin/activate
python3 -m pip install -r requirements.txt
```
Use the `process` subcommand.
Your Confluence space key can be found in its URL, e.g. `https://example.atlassian.net/wiki/spaces/<space-key>`.
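For illustration, a hypothetical helper (not part of this project's CLI) could pull the key out of such a URL:

```python
from urllib.parse import urlparse

def space_key(url: str) -> str:
    """Extract <space-key> from an .../wiki/spaces/<space-key> URL."""
    parts = urlparse(url).path.rstrip("/").split("/")
    return parts[parts.index("spaces") + 1]

print(space_key("https://example.atlassian.net/wiki/spaces/ABC"))  # ABC
```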
```shell
python3 src/main.py --confluence-space-key ABC process
```
Note that subsequent runs with the same arguments will replace the previous vector store for that Confluence space.
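Processing steps like this one typically split long pages into overlapping chunks before embedding them. The sliding-window sketch below shows the idea; the sizes are assumptions for illustration, not the project's actual settings:

```python
def chunk(text, size=200, overlap=50):
    # Slide a window of `size` characters, stepping by size - overlap,
    # so adjacent chunks share `overlap` characters of context.
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks

page = "x" * 500  # stand-in for one Confluence page's text
parts = chunk(page)
print(len(parts), [len(p) for p in parts])
```

Each chunk would then be embedded and written to the vector store alongside its source page.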
To start the chat application using an existing dataset, use the `chat` subcommand:
```shell
python3 src/main.py --confluence-space-key ABC chat
```
The Streamlit chat app will run, and you can interact with the chatbot at http://localhost:8501 (or the next available port) to ask questions about the Confluence space.
When you're done, exit the virtual environment:

```shell
deactivate
```
For complete CLI instructions, run:

```shell
python src/main.py --help
```
The overall structure was pulled from peterw/Chat-With-Github-Repo.