A comprehensive collection of utility scripts and helper functions designed to streamline your work with various generative AI APIs, including OpenAI, Anthropic, and others. This repository provides reusable components and examples that make it easy to integrate and manage generative AI capabilities in your projects.
🛠️ Generative AI Utilities:
- Unified scripts for interacting with multiple generative AI APIs, such as OpenAI and Anthropic.
- Includes functions for managing files, chat interactions, and vector store operations.
- Customizable scripts for generating responses, processing prompts, and handling API-specific tasks.
🔧 Common Utilities:
- Environment management, logging, and request handling utilities that can be shared across different AI APIs.
- Custom CSS for enhancing the appearance of Streamlit applications, making your interfaces more user-friendly and visually appealing.
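As an illustration of the custom-CSS approach (the function name and the `.banner` selector here are hypothetical, not taken from the repository), a style block can be built as a string and injected into a Streamlit app with `st.markdown(..., unsafe_allow_html=True)`:

```python
def banner_css(background: str = "#0e1117", padding: str = "1rem") -> str:
    """Build a <style> block for a hypothetical .banner element."""
    return (
        "<style>"
        f".banner {{ background: {background}; padding: {padding}; }}"
        "</style>"
    )

# In a Streamlit app you would inject it like this:
# import streamlit as st
# st.markdown(banner_css(), unsafe_allow_html=True)
```

Keeping the CSS in a plain function like this makes it easy to reuse and theme across pages.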
To get started with these utilities, clone the repository and install the necessary dependencies:
```bash
git clone https://github.com/yourusername/gen-ai-utils.git
cd gen-ai-utils
pip install -r requirements.txt
```
Make sure to set up your environment variables, particularly API keys for the generative AI services you're using. You can use a `.env` file to securely manage these variables.
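A minimal sketch of reading those variables at runtime (the name `OPENAI_API_KEY` follows the usual OpenAI convention; the helper itself is illustrative, not part of the repository):

```python
import os

def require_env(name: str) -> str:
    """Return an environment variable's value, failing fast if it is unset.

    python-dotenv (if installed) can populate os.environ from a .env file
    before this is called, e.g. via load_dotenv().
    """
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value
```

Failing fast at startup (e.g., `api_key = require_env("OPENAI_API_KEY")`) gives a clearer error than a cryptic authentication failure deep inside an API call.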
The repository is organized to support multiple AI APIs, with a clear separation of utilities and examples:
```
gen-ai-utils/
│
├── app-streamlit.py                  # Example Streamlit app demonstrating usage
├── generative_ai/
│   ├── openai/
│   │   ├── batch-update-vector-store-files.ipynb  # Jupyter notebook for OpenAI vector store batch updates
│   │   ├── delete_vector_store_files.ipynb        # Jupyter notebook for OpenAI vector store file deletion
│   │   ├── openai_assistant_response.py           # Script for generating responses using the OpenAI Assistant API
│   │   └── openai_utils.py                        # Utility functions for interacting with the OpenAI API
│   ├── anthropic/
│   │   └── anthropic_utils.py        # Placeholder for Anthropic-specific utilities
│   ├── gemini/                       # Placeholder for Gemini API utilities
│   └── llama/                        # Placeholder for LLaMA API utilities
├── utils/
│   ├── custom_css_main_page.py       # Custom CSS for the main page styling
│   ├── custom_css_banner.py          # Custom CSS for the banner
│   └── message_utils.py              # Utility functions for formatting and displaying messages
├── .gitignore                        # Git ignore file
├── LICENSE                           # License for the repository
├── README.md                         # This file
├── requirements.txt                  # Python dependencies
└── .env.example                      # Example environment variables file
```
To generate a response using a generative AI API (e.g., OpenAI):

```python
from generative_ai.openai.openai_utils import generate_assistant_response

# Generate a response using OpenAI's Assistant API
response = generate_assistant_response("What's the weather today?", "your_assistant_id")
print(response)
```
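API calls like the one above can fail transiently (rate limits, network hiccups). A generic retry-with-backoff wrapper, sketched here as an illustration rather than a utility the repository actually ships, looks like:

```python
import time

def call_with_retries(fn, attempts: int = 3, base_delay: float = 1.0):
    """Call fn(), retrying with exponential backoff on any exception."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries; surface the last error
            time.sleep(base_delay * (2 ** attempt))

# Hypothetical usage with the helper above:
# response = call_with_retries(
#     lambda: generate_assistant_response("Hello", "your_assistant_id")
# )
```

In production code you would typically narrow the `except` clause to the API client's rate-limit and timeout exceptions rather than catching everything.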
For managing vector stores with OpenAI:
- Batch Update: Use `batch-update-vector-store-files.ipynb` to update multiple files in a vector store.
- File Deletion: Use `delete_vector_store_files.ipynb` to delete files from a vector store.
Run the example Streamlit app `app-streamlit.py` to see how these utilities can be integrated into a web application:

```bash
streamlit run app-streamlit.py
```
The app showcases chat interactions using OpenAI's API, enhanced with custom CSS and message formatting utilities.
- Custom CSS: Use `custom_css_main_page.py` and `custom_css_banner.py` to style your Streamlit apps.
- Message Formatting: Use `message_utils.py` for consistent formatting and display of chat messages.
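For context, chat APIs such as OpenAI's generally expect messages as role/content dictionaries. A helper along these lines (hypothetical; the actual functions in `message_utils.py` may differ) keeps that shape consistent:

```python
def make_message(role: str, content: str) -> dict:
    """Build a chat message dict in the role/content shape chat APIs expect."""
    allowed = {"system", "user", "assistant"}
    if role not in allowed:
        raise ValueError(f"role must be one of {sorted(allowed)}, got {role!r}")
    return {"role": role, "content": content}

# Building a conversation history for display or for an API call:
history = [
    make_message("system", "You are a helpful assistant."),
    make_message("user", "What's the weather today?"),
]
```

Validating the role up front catches typos before they reach the API and produce a less helpful server-side error.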
Contributions are welcome! Please refer to the CONTRIBUTING.md file for guidelines on how to contribute to this project.
This project is licensed under the MIT License - see the LICENSE file for details.
For any questions or suggestions, feel free to open an issue or contact the repository owner.