This Local-LLM project lets you host your own LLM locally on your machine.
You can use it with LM Studio or with the OpenAI API.
To use it with OpenAI, you need to edit the code and add your own OpenAI API key.
If you have questions or optimization suggestions, please write them in the Issues or Pull Requests section.
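As a rough illustration of how the app can talk either to LM Studio or to OpenAI, both expose the same OpenAI-style chat-completions HTTP interface, so only the base URL and API key differ. The sketch below uses only the Python standard library; the endpoint URLs, the `local-model` name, and the `lm-studio` placeholder key are assumptions (LM Studio's server typically listens on port 1234), not values taken from this repository's code.

```python
import json
import urllib.request

# Assumed endpoints (not from this repo): LM Studio's OpenAI-compatible
# server usually listens locally on port 1234; OpenAI's is the public API.
LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"
OPENAI_URL = "https://api.openai.com/v1/chat/completions"


def build_chat_request(prompt, model="local-model"):
    """Build an OpenAI-style chat-completion payload.

    `model` is a placeholder; LM Studio often ignores it, while OpenAI
    requires a real model name such as "gpt-4o-mini".
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def send_chat_request(url, payload, api_key="lm-studio"):
    """POST the payload to an OpenAI-compatible endpoint.

    For OpenAI, `api_key` must be your real API key; for LM Studio a
    dummy value is usually accepted.
    """
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Switching the app between backends then amounts to changing which URL and key are passed to `send_chat_request`.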
- Create the conda environment
conda create -n local-llm python=3.10
- Activate the environment
conda activate local-llm
- Install the requirements
pip install -r requirements.txt
- Run the app via Streamlit
streamlit run app.py
- The Streamlit server is now running and the output looks like this:
You can now view your Streamlit app in your browser.
Local URL: http://localhost:8501
Network URL: http://192.168.178.176:8501