lightningralf / chainfury

This project forked from nimbleboxai/chainfury


🦋 Simplify the creation and management of prompt chains. Build complex chat applications using LLMs with 4 clicks ⚡️

Home Page: https://chainfury.nbox.ai

License: MIT License

Languages: Shell 0.44%, JavaScript 6.42%, Python 29.21%, TypeScript 62.51%, CSS 0.83%, HTML 0.20%, Dockerfile 0.39%


🦋 ChainFury

ChainFury is a powerful tool that simplifies the creation and management of chains of prompts, making it easier to build complex chat applications using LLMs. With a simple GUI inspired by LangFlow, ChainFury lets you chain LangChain components together and embed the resulting chat application in your own site with a simple JS snippet.
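To make the idea of a prompt chain concrete, here is a minimal sketch of a two-step chain in plain Python using the openai package (version 1.x). It is a generic illustration, not ChainFury's or LangChain's API; the helper names and the model name are made up for the example, and an OPENAI_API_KEY is assumed to be set in the environment.

# Hypothetical two-step prompt chain: the output of the first prompt is fed
# into the second prompt. This is a generic sketch, not the ChainFury API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask(prompt: str) -> str:
    # One LLM call; the model name here is only an example.
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def answer_question(question: str) -> str:
    # Step 1: rewrite the question into a short search-style query.
    query = ask(f"Rewrite this question as a short search query: {question}")
    # Step 2: answer the original question, conditioned on step 1's output.
    return ask(f"Using the query '{query}', answer the question: {question}")

print(answer_question("How do transformers handle long documents?"))

ChainFury's point is that this kind of wiring is configured visually in the GUI rather than written by hand in code.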

You can try out ChainFury at https://chainfury.nbox.ai.

Features

ChainFury supports a range of features, including:

  • Recording all prompts and responses and storing them in a database
  • Collecting metrics like latency to provide an easy-to-use scoring mechanism for end-users
  • Querying OpenAI's API to obtain a rating for each response and storing it in the database (a sketch of this idea follows the list)
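As an illustration of the last two points, here is a minimal sketch in Python of rating a prompt/response pair with the OpenAI API and persisting the result in SQLite. This is an assumption about the general technique, not ChainFury's actual implementation; the table name, rating scale, and model name are invented for the example.

# Hypothetical sketch: ask an LLM to score a response, then persist the score.
# NOT ChainFury's implementation; table name and 1-5 scale are invented.
import sqlite3
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def rate_response(prompt: str, response: str) -> int:
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{
            "role": "user",
            "content": "Rate the following answer to the prompt on a scale of "
                       "1 to 5. Reply with a single digit only.\n\n"
                       f"Prompt: {prompt}\nAnswer: {response}",
        }],
    )
    # Assumes the model replies with a leading digit, as instructed.
    return int(resp.choices[0].message.content.strip()[0])

def store_rating(prompt: str, response: str, rating: int) -> None:
    db = sqlite3.connect("ratings.db")
    db.execute("CREATE TABLE IF NOT EXISTS ratings "
               "(prompt TEXT, response TEXT, rating INTEGER)")
    db.execute("INSERT INTO ratings VALUES (?, ?, ?)", (prompt, response, rating))
    db.commit()
    db.close()

p, r = "What is ChainFury?", "A GUI tool for building chains of LLM prompts."
store_rating(p, r, rate_response(p, r))

Ratings and latency metrics collected this way are the kind of feedback data that surfaces in the ChainFury dashboard described later in this README.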

Components

From the LangChain documentation, there are six main areas that LangChain is designed to help with. ChainFury builds on the same concepts for creating LLM chatbots. The components are, in increasing order of complexity:

Glossary | LangChain | ChainFury
📃 LLMs and Prompts | Prompt management, prompt optimization, a generic interface for all LLMs, and common utilities for working with LLMs | Easy prompt management with GUI elements
🔗 Chains | Chains are sequences of calls (whether to an LLM or a different utility). LangChain provides a standard interface for chains, lots of integrations with other tools, and end-to-end chains for common applications | Easy chain management with GUI
📚 Data Augmented Generation | Data Augmented Generation involves specific types of chains that first interact with an external data source to fetch data to use in the generation step. Examples include summarization of long pieces of text and question answering over specific data sources | Coming soon
🤖 Agents | Agents involve an LLM making decisions about which actions to take, taking that action, seeing an observation, and repeating that until done. LangChain provides a standard interface for agents, a selection of agents to choose from, and examples of end-to-end agents | Easy agent management with GUI
🧠 Memory | Memory is the concept of persisting state between calls of a chain/agent. LangChain provides a standard interface for memory, a collection of memory implementations, and examples of chains/agents that use memory | Memory modules are supported; persistent memory coming soon
🧐 Evaluation [BETA] | Generative models are notoriously hard to evaluate with traditional metrics. One new way of evaluating them is to use language models themselves to do the evaluation. LangChain provides some prompts/chains for assisting in this | Auto evaluation of all prompts through OpenAI APIs


Installation

Installing ChainFury is easy, with two methods available.

Method 1: Docker

The easiest way to install ChainFury is to use Docker. You can use the following commands to build and run ChainFury:

docker build . -f Dockerfile -t chainfury:latest

docker run --env OPENAI_API_KEY=<your_key_here> -p 8000:8000 chainfury:latest

You can also pass a Database URL to the docker container using the DATABASE_URL environment variable. If you do not pass a database URL, ChainFury will use a SQLite database.

Example:

docker run -it -e DATABASE_URL="mysql+pymysql://<user>:<password>@127.0.0.1:3306/<database>" -p 8000:8000 chainfury:latest

Now you can access the app on localhost:8000.

Method 2: Manual

For this, you will need to build the frontend and then run the backend. The frontend can be built using the following commands:

cd client
yarn install
yarn build

To copy the frontend to the backend, run the following commands:

cd ..
cp -r client/dist/ server/static/
mkdir -p ./server/templates
cp ./client/dist/index.html ./server/templates/index.html

Now you can install the backend dependencies and run the server. We recommend using a Python 3.9 virtual environment for this:

python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
cd server
python3 -m uvicorn app:app --log-level=debug --host 0.0.0.0 --port 8000 --workers 1

Now you can access the app on localhost:8000.



Getting Started

  1. Start the server using the provided Dockerfile or the manual method described above.

  2. Log into ChainFury by entering username = "admin" and password = "admin".

  3. Click on create chatbot.

  4. Use one of the pre-configured chatbots or use the elements to create a custom chatbot.

  5. Save and create your chatbot, then start chatting with it by clicking the chat in the bottom-right corner. You can see chatbot statistics and feedback metrics in your ChainFury dashboard.



ChainFury is a work in progress, and is currently in the alpha stage. Please feel free to contribute to the project in any form!

Contributors

chanh-1, priya314, priyasridharandav, rohanpooniwala, vgulerianb, yashbonde
