This is an example AI app with assistants built on phidata. It uses Streamlit, PgVector, and local LLMs served by Ollama, and you can run it locally using Docker.
- Clone the git repo, then run the following commands from the ai-app dir:
- Create + activate a virtual env:
python3 -m venv venv
source ./venv/bin/activate
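Before installing dependencies, you can confirm the virtual env is actually active. This is a minimal standard-library sketch (the `in_virtualenv` helper is an illustrative name, not part of phidata):

```python
import sys

def in_virtualenv() -> bool:
    """Return True when the interpreter is running inside a virtual env.

    Inside a venv, sys.prefix points at the environment directory,
    while sys.base_prefix still points at the system installation.
    """
    return sys.prefix != sys.base_prefix

# Example: run `python -c "..."` after `source ./venv/bin/activate`
# and this should report True.
print(in_virtualenv())
```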
- Install dependencies:
./scripts/install.sh
- Set up the workspace:
phi ws setup
- Upgrade phidata:
pip install -U phidata
- Start the workspace using:
phi ws up
- Pull the llama3 and nomic-embed-text models (run this command only once during the initial workspace setup):
./scripts/pull_ollama_models.sh
- Open localhost:8501 to view the Streamlit app.
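If you want to script against the app (for example, waiting for the container to come up before opening the browser), a small polling helper can check that the URL above is answering. This is a hypothetical convenience sketch using only the standard library; the URL comes from the step above:

```python
import time
import urllib.error
import urllib.request

def wait_for_app(url: str, timeout: float = 30.0) -> bool:
    """Poll `url` until the server responds, or give up after `timeout` seconds.

    Returns True as soon as any HTTP response comes back (even an error
    status means the server is up); False if the deadline passes with no
    connection at all.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            urllib.request.urlopen(url, timeout=2)
            return True
        except urllib.error.HTTPError:
            return True  # server answered, just with an error status
        except (urllib.error.URLError, OSError):
            time.sleep(0.5)  # not up yet; retry until the deadline
    return False

# Usage: wait_for_app("http://localhost:8501") after `phi ws up`
```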
- Stop the workspace using:
phi ws down
Build the development image using:
phi ws up --env dev --infra docker --type image
To force a rebuild of the images, use the --force or -f flag:
phi ws up --env dev --infra docker --type image --force
Restart all docker containers using:
phi ws restart --env dev --infra docker --type container