ai-in-hand / platform

AI Agent Automation Platform: Rapidly prototype, test, and deploy Multi-Agent Systems from your browser.

Home Page: https://platform.ainhand.com

License: GNU Affero General Public License v3.0

Python 60.75% Procfile 0.01% JavaScript 0.32% TypeScript 37.45% CSS 1.46%
ai ai-agents ai-agents-framework genai workflow-automation

platform's Issues

Fix Updating Assistants

Updating assistants has stopped working. We need to fix this issue. Review the PUT /agent endpoint code and the Agency-Swarm code to locate the bug.

Using Azure API?

Hey, do you think it's possible to use the Azure API? I can't find the file where the request to GPT is made. I'd like to modify it to call Azure instead of OpenAI. Thanks a lot
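A minimal sketch of what pointing the client at Azure could look like, assuming the platform uses the openai>=1.x Python SDK; the endpoint, deployment name, and API version below are placeholders, and where exactly the client is created depends on the Agency-Swarm integration:

from openai import AzureOpenAI

# Azure routes requests by deployment name rather than by raw model name.
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<AZURE_OPENAI_API_KEY>",
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="<your-deployment-name>",
    messages=[{"role": "user", "content": "Hello from Azure"}],
)
print(response.choices[0].message.content)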

Implement External Application Integration via Platform API

  • Develop API endpoints to allow external apps to integrate with our platform: generate an API key (using Firebase Auth custom tokens), create a new session (same as POST /session), and send a message (same as POST /message, but accepting an API key). See the sketch after this list.
  • Implement API key management for secure and controlled access.
  • Create detailed API documentation and guidelines for third-party developers.
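A minimal sketch of the API-key issuance endpoint, assuming FastAPI and the Firebase Admin SDK; the route path, the shape of what get_current_user returns, and the response format are assumptions:

from fastapi import APIRouter, Depends, HTTPException
from firebase_admin import auth as firebase_auth

from backend.dependencies.auth import get_current_user  # existing auth dependency

router = APIRouter()

@router.post("/api-key")
def generate_api_key(current_user=Depends(get_current_user)):
    # Mint a Firebase custom token that an external app can present as its API key.
    # Assumes firebase_admin.initialize_app() has been called at startup and that
    # current_user exposes an id attribute.
    try:
        token = firebase_auth.create_custom_token(current_user.id)
    except Exception:
        raise HTTPException(status_code=500, detail="Could not generate API key")
    return {"api_key": token.decode("utf-8")}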

Stabilize the Chatbox

Potential areas of improvement for the ChatBox component:

  • WebSocket Message Handling: Ensure robust handling of empty or malformed messages.
  • State Update on WebSocket Response: Use a more reliable method to update the state to avoid race conditions.
  • Error Handling: Clear error states appropriately after successful message transactions.
  • UI Responsiveness: Update chatMaxHeight dynamically on window resize to maintain layout integrity.
  • Message Parsing: Add null checks before parsing message metadata to prevent exceptions.
  • Text Area Handling: Prevent sending empty messages by checking the text area content before submission.
  • Component Cleanup: Implement a cleanup function for the WebSocket connection on component unmount to prevent memory leaks.

Open-Source Release TODO Checklist

  • Finish backend-frontend integration

    • Align skills type with backend config model
    • Align agent type with backend config model
    • Align agency type with backend config model
    • Integrate /session and /message endpoints
    • Stabilization and bug fixes
  • Enhance Documentation Across Modules and Functions

    • Document every function, class, and module with clear and detailed descriptions.
    • Ensure docstrings cover parameters, return types, exceptions, and include inline comments for complex logic.
  • Implement Unit and Integration Tests

    • Achieve 90% test coverage with consistent and easily maintainable tests.
    • Achieve 95% test coverage with consistent and easily maintainable tests.
    • Achieve 100% test coverage with consistent and easily maintainable tests.
  • Address TODOs

    • Address and resolve existing TODO and FIXME comments in the codebase.
  • Refactor dependencies to services for modularity

    • Identify and move business logic from dependencies to services.
    • Create corresponding service classes in services directory.
    • Update references in other modules to use these services, ensuring modularity and minimal disruption.
  • Develop Robust Error Handling in API Routes

    • Standardize error responses across all API endpoints with appropriate HTTP status codes.
    • Implement user-friendly error messages and log errors for internal monitoring.
    • Use try/except blocks and ensure errors do not expose sensitive information (a standardized error-handler sketch follows this checklist).
  • Strengthen Authentication and Authorization

    • Review all API endpoints for proper authentication and authorization checks.
    • Implement role-based access control where necessary.
    • Ensure sensitive data is securely handled in transit and at rest.
  • Secure Sensitive Data Handling and Input Validation

    • Audit and implement comprehensive input validation at all API entry points.
    • Securely handle sensitive data like API keys and user credentials.
    • Encrypt sensitive data in transit and at rest.
  • Review and Enhance Configuration Management

    • Audit and review current configuration management, especially for sensitive data.
    • Implement environment-specific configurations and securely manage API keys and sensitive parameters.
    • Update and centralize configuration documentation.
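Along the lines of the "Develop Robust Error Handling in API Routes" item above, a minimal sketch of a standardized FastAPI exception handler; the response shape and logger name are assumptions:

import logging

from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

logger = logging.getLogger("api")
app = FastAPI()

@app.exception_handler(Exception)
async def unhandled_exception_handler(request: Request, exc: Exception) -> JSONResponse:
    # Log full details internally; return a generic, non-sensitive message to the client.
    logger.exception("Unhandled error on %s %s", request.method, request.url.path)
    return JSONResponse(
        status_code=500,
        content={"error": "Internal Server Error", "detail": "An unexpected error occurred."},
    )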

Refactor: remove persistence layer dependency from the application layer

Remove storage dependencies from the endpoints below and move the logic to the service layer (managers); a sketch of the pattern follows the checklist.
Endpoints to be updated:

  • get("/agency")
  • delete("/agency")
  • get("/agent")
  • post("/skill/approve")
Checklist
  • Modify backend/routers/api/v1/agency.py (788b97e)
  • Modify backend/routers/api/v1/agent.py (788b97e)
  • Modify backend/routers/api/v1/skill.py (788b97e)
  • Modify backend/services/agency_manager.py (788b97e)
  • Modify backend/services/agent_manager.py (788b97e)
  • Modify backend/services/skill_manager.py (788b97e)
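A minimal sketch of the intended pattern: the router delegates to a manager and never touches storage directly. The AgencyManager method name, its signature, and the get_agency_manager provider are assumptions for illustration:

from fastapi import APIRouter, Depends

from backend.dependencies.auth import get_current_user
from backend.services.agency_manager import AgencyManager

router = APIRouter()

def get_agency_manager() -> AgencyManager:
    # Hypothetical provider; in the real code this would live in backend/dependencies.
    ...

@router.get("/agency")
def get_agency(
    id: str,
    current_user=Depends(get_current_user),
    manager: AgencyManager = Depends(get_agency_manager),
):
    # No storage imports here: ownership checks and persistence access live in the manager.
    return manager.get_agency(id, user_id=current_user.id)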

Use S3 storage instead of /data folder

In the WriteAndSaveProgram skill, we utilize the agency_id to construct a directory path for each agency. However, this is a bad practice and we need a persistent solution. We should establish an external storage solution for each user. Using S3-compliant storage and creating a bucket would be a viable approach.

To integrate S3 storage into our FastAPI application, we can follow these steps:

  1. [us] Choose an S3 Storage Provider: Since we've deployed the app on Heroku, supporting Amazon S3 for storage would be a straightforward choice due to its broad support, scalability, and integration with Heroku.

  2. [end user] Set Up the S3 Bucket: Set up an S3 bucket in our preferred region. Ensure the bucket policy and IAM roles are set up accordingly to allow programmatic access.

  3. [us] Install Boto3: Boto3 is the Amazon Web Services (AWS) SDK for Python, which allows Python developers to write software that uses services like Amazon S3.

  4. [end user] Configure Access Credentials: End users need to securely store their access credentials (such as AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY) in user variables.

  5. [us] Modify the Current File Handling Logic: We currently have the WriteAndSaveProgram skill in /backend/custom_skills/write_and_save_program.py that manages file writing locally. We'll replace the logic in the File class's run method with code to upload files to S3. Here's a basic example of how we might upload files to S3:

import boto3
from botocore.exceptions import NoCredentialsError

def upload_to_s3(bucket_name, file_name, body):
    s3 = boto3.client('s3')
    try:
        s3.put_object(Bucket=bucket_name, Key=file_name, Body=body)
        print(f"File {file_name} uploaded to {bucket_name}")
    except NoCredentialsError:
        print("Credentials not available")

# In the File class's run method, replace the file writing logic with:
upload_to_s3("bucket_name", self.file_name, self.body)
  6. [us] Test the Integration: Before deploying the changes to production, ensure we've tested the S3 integration locally and on a staging environment to confirm that files are uploaded correctly and our application can retrieve them as needed (a retrieval sketch follows below).

  7. [us] Update Documentation: Reflect these changes in both the main README.md and the backend's README.md, specifying the requirement for S3 storage and environment variable setup.

Use this branch:
https://github.com/AI-in-Hand/platform/tree/feat/use_s3_storage_instead_of_data_folder
Need to apply the new approach to all the skills that read/write to the local drive.
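For completeness, a retrieval counterpart to the upload example above, since skills will also need to read files back from S3; bucket and key names are placeholders:

import boto3
from botocore.exceptions import ClientError, NoCredentialsError

def download_from_s3(bucket_name, file_name):
    s3 = boto3.client('s3')
    try:
        # get_object returns a streaming body; read it fully and decode as text.
        response = s3.get_object(Bucket=bucket_name, Key=file_name)
        return response["Body"].read().decode("utf-8")
    except NoCredentialsError:
        print("Credentials not available")
    except ClientError as e:
        print(f"Could not download {file_name} from {bucket_name}: {e}")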

POST /message endpoint timeout

Fix a bug where, after ~30 seconds of waiting, the message is never returned in the UI.
After 30 seconds, set an interval to poll the GET /message/list endpoint for the messages.

Implement get_current_user caching in Redis with 5 min expiration

Set a key with the user ID in Redis when we authenticate a user. Add caching to the get_current_user function located in backend/dependencies/auth.py. get_current_user is used in the endpoints, e.g. backend/routers/api/v1/user.py.

Use RedisCacheManager from backend/services/redis_cache_manager.py (may use get_redis_cache_manager from backend/dependencies/dependencies.py).

If the key is present in Redis, assume the user is authorized and don't send a request to Firebase.

In backend/dependencies/auth.py:
Use the token as the key; don't extract user_id from the token. Use user.model_dump() to convert the user into a dictionary.
Also, update the tests in tests/unit/dependencies/test_auth.py.
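A minimal sketch of the caching pattern, shown with a raw redis client and a 5-minute TTL for illustration; the real implementation should go through RedisCacheManager, and the User model and Firebase verification call below are stand-ins for whatever backend/dependencies/auth.py currently does:

import json

import redis.asyncio as redis
from fastapi import Depends, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer
from firebase_admin import auth as firebase_auth
from pydantic import BaseModel

security = HTTPBearer()
redis_client = redis.Redis()
CACHE_TTL_SECONDS = 5 * 60  # 5 min expiration

class User(BaseModel):
    # Hypothetical, pared-down user model for the sketch.
    id: str
    email: str | None = None

async def get_current_user(credentials: HTTPAuthorizationCredentials = Depends(security)) -> User:
    token = credentials.credentials

    # Cache hit: assume the user is authorized and skip the Firebase round trip.
    cached = await redis_client.get(token)
    if cached is not None:
        return User(**json.loads(cached))

    # Cache miss: verify with Firebase as before, then cache user.model_dump() under the token.
    try:
        decoded = firebase_auth.verify_id_token(token)
    except Exception:
        raise HTTPException(status_code=401, detail="Invalid token")
    user = User(id=decoded["uid"], email=decoded.get("email"))
    await redis_client.set(token, json.dumps(user.model_dump()), ex=CACHE_TTL_SECONDS)
    return user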

Implement Feedback Mechanism

Add a button that opens a modal for submitting user feedback. Alternatively, set up a chatbot widget that includes all information about the platform functionality and connect it to AirTable or Slack.

Deploy to AWS

I am planning to deploy this to AWS and need a high-level architecture diagram to guide the backend deployment for internal use. Is there such a diagram available?

Sweep: Use S3 storage instead of /data folder

In the WriteAndSaveProgram skill, instead of using the agency_id to construct a directory path for each agency, we should access S3 storage using the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY user variables.
How to use user variables in the skill:

from backend.repositories.user_variable_storage import UserVariableStorage
from backend.services.user_variable_manager import UserVariableManager
user_variable_manager = UserVariableManager(UserVariableStorage())
airtable_base_id = user_variable_manager.get_by_key("AIRTABLE_BASE_ID")

To integrate S3 storage into our FastAPI application, we can follow these steps:

  1. Choose an S3 Storage Provider: Amazon S3
  2. Install Boto3: Boto3 is the Amazon Web Services (AWS) SDK for Python, which allows Python developers to write software that uses services like Amazon S3.
  3. Extract credentials from the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY user variables.
  4. Modify the Current File Handling Logic: We currently have the WriteAndSaveProgram skill in backend/custom_skills/write_and_save_program.py that manages file writing locally. Replace the logic in the File class's run method with code to upload files to S3. Here's a basic example of how we might upload files to S3:
import boto3
from botocore.exceptions import NoCredentialsError

def upload_to_s3(bucket_name, file_name, body):
    s3 = boto3.client('s3')
    try:
        s3.put_object(Bucket=bucket_name, Key=file_name, Body=body)
        print(f"File {file_name} uploaded to {bucket_name}")
    except NoCredentialsError:
        print("Credentials not available")

# In the File class's run method, replace the file writing logic with:
upload_to_s3("bucket_name", self.file_name, self.body)

In the backend/custom_skills/write_and_save_program.py file, use user_variable_manager.get_by_key("AWS_BUCKET_NAME") for bucket_name (see the combined sketch below).
Also update the tests in tests/unit/custom_skills/test_write_and_save_program.py to match the updated code.
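A minimal sketch tying the pieces together: credentials and the bucket name are read through UserVariableManager and passed explicitly to the boto3 client. The function name and error handling are illustrative only:

import boto3
from botocore.exceptions import ClientError

from backend.repositories.user_variable_storage import UserVariableStorage
from backend.services.user_variable_manager import UserVariableManager

user_variable_manager = UserVariableManager(UserVariableStorage())

def upload_program_to_s3(file_name: str, body: str) -> None:
    # Pull per-user credentials and the bucket name from user variables
    # instead of writing to the local /data folder.
    s3 = boto3.client(
        "s3",
        aws_access_key_id=user_variable_manager.get_by_key("AWS_ACCESS_KEY_ID"),
        aws_secret_access_key=user_variable_manager.get_by_key("AWS_SECRET_ACCESS_KEY"),
    )
    bucket_name = user_variable_manager.get_by_key("AWS_BUCKET_NAME")
    try:
        s3.put_object(Bucket=bucket_name, Key=file_name, Body=body)
    except ClientError as e:
        print(f"Failed to upload {file_name} to {bucket_name}: {e}")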
