
langchain-streamlit-template's Introduction

LangChain-Streamlit Template

This repo serves as a template for deploying a LangChain app on Streamlit.

This repo contains a main.py file with a template chatbot implementation.

Adding your chain

To add your chain, you need to change the load_chain function in main.py. Depending on the type of your chain, you may also need to change the inputs/outputs that occur later on.
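As a rough sketch, a replacement load_chain might look like the following (a hypothetical example assuming a LangChain ConversationChain; your own chain type and arguments will differ):

```python
# main.py — hypothetical load_chain sketch, not the repo's exact code
from langchain.chains import ConversationChain
from langchain.llms import OpenAI

def load_chain():
    """Build and return the chain the app calls on each user input."""
    llm = OpenAI(temperature=0)  # reads OPENAI_API_KEY from the environment
    return ConversationChain(llm=llm)
```

If your chain expects inputs other than a single string (or returns a dict), update the call sites later in main.py to match.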

Deploy on Streamlit

This is easily deployable on the Streamlit platform. Note that when setting up your Streamlit app you should make sure to add OPENAI_API_KEY as a secret environment variable.
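On Streamlit Community Cloud the secret is added in the app's Secrets settings; for local development it can go in a .streamlit/secrets.toml file (placeholder value shown):

```toml
# .streamlit/secrets.toml — do not commit this file
OPENAI_API_KEY = "sk-..."
```

Streamlit also exports secrets as environment variables, so libraries that read OPENAI_API_KEY from the environment pick it up automatically.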


langchain-streamlit-template's Issues

use new streamlit features

The app could use the st.chat_message() and st.chat_input() APIs instead of st.text_input().
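A sketch of what that swap might look like (assumes Streamlit ≥ 1.24 and a chain already stored in st.session_state["chain"]; names are illustrative):

```python
import streamlit as st

# Replay the conversation so far
for msg in st.session_state.get("messages", []):
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

# st.chat_input renders a chat-style box pinned to the bottom of the page
if user_input := st.chat_input("Ask a question"):
    st.session_state.setdefault("messages", []).append(
        {"role": "user", "content": user_input}
    )
    output = st.session_state["chain"].run(user_input)
    st.session_state["messages"].append({"role": "assistant", "content": output})
```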

How to stream the LLM response with Streamlit?

I am following this script, using a RetrievalQA chain.

Code:

from langchain.llms import OpenAI
from langchain.chains import RetrievalQA
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

llm = OpenAI(streaming=True, callbacks=[StreamingStdOutCallbackHandler()], temperature=0)
chain = RetrievalQA.from_chain_type(llm=llm, chain_type='refine', retriever=docsearch.as_retriever())

...

if 'user_input' not in st.session_state:
    st.session_state['user_input'] = []
if 'generated_text' not in st.session_state:
    st.session_state['generated_text'] = []

user_input = st.text_area('Enter a question', value=f"What are trends in {st.session_state['thematic']['term']}?")

button_2 = st.button('Get answer')

if user_input and button_2:
    st.session_state.user_input.append(user_input)
    with st.spinner('Running LLM...'):
        st.session_state.generated_text.append(st.session_state['chain'].run(user_input))

if 'generated_text' in st.session_state and len(st.session_state['generated_text']) > 0:
    for i in range(len(st.session_state['generated_text']) - 1, -1, -1):
        message(st.session_state['user_input'][i], is_user=True, key=str(i) + '_user')
        message(st.session_state['generated_text'][i], key=str(i))

How can I stream the response of the LLM in real time (like on the console)?
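One common pattern is a callback handler that appends each token to a placeholder in the page as it arrives. Below is a minimal, framework-free sketch of that handler: in a real app it would subclass LangChain's BaseCallbackHandler, the container would be st.empty(), and you would pass callbacks=[handler] when invoking the chain (all of those wiring details are assumptions, not the repo's code):

```python
class StreamHandler:
    """Accumulates streamed tokens and re-renders the running text in a
    UI container (st.empty() in a real Streamlit app)."""

    def __init__(self, container):
        self.container = container
        self.text = ""

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        # Called once per generated token; redraw everything so far.
        self.text += token
        self.container.markdown(self.text)


class FakeContainer:
    """Stand-in for st.empty(); records what would be rendered."""

    def __init__(self):
        self.rendered = []

    def markdown(self, text):
        self.rendered.append(text)


handler = StreamHandler(FakeContainer())
for tok in ["Hel", "lo", "!"]:
    handler.on_llm_new_token(tok)
# handler.text is now "Hello!" and the container re-rendered once per token
```

Note that streaming only reaches the page if the callback runs in the script's own thread, which is why the placeholder-update approach is preferred over printing to stdout.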

Memory is not getting stored in conversation flows in streamlit app

[Screenshot 2023-03-15 at 9:30:06 PM]

Verbose output:

[Screenshot 2023-03-15 at 9:30:49 PM]

The history is not getting saved. I defined the chain as follows:

chain = ConversationChain(
    llm=llm,
    verbose=True,
    memory=ConversationBufferMemory(),
)

I also tried using a prompt with an LLMChain:

#prompt= prompt definition here
prompt = PromptTemplate(
    input_variables=["history", "human_input"], 
    template=template
)

chatgpt_chain = LLMChain(
    llm=llm, 
    prompt=prompt, 
    verbose=True, 
    memory=ConversationBufferWindowMemory(k=4),
)

output = chatgpt_chain.predict(human_input="Hello")

All of this works fine in notebooks, but in the Streamlit app the history is not getting passed into the prompt.
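A likely cause: Streamlit reruns the entire script on every widget interaction, so a chain built at module level — along with its ConversationBufferMemory — is recreated from scratch on each run, discarding the history. The usual fix is to build the chain once and cache it in st.session_state. A pure-Python sketch of the difference, with a dict standing in for st.session_state and a stub Memory class standing in for ConversationBufferMemory (both hypothetical stand-ins):

```python
session_state = {}  # stands in for st.session_state, which survives reruns

class Memory:
    """Stand-in for ConversationBufferMemory."""
    def __init__(self):
        self.history = []

def run_script():
    """Simulates one Streamlit rerun of the whole script."""
    fresh = Memory()  # bug: recreated (and emptied) on every rerun
    if "memory" not in session_state:
        session_state["memory"] = Memory()  # fix: created once, then reused
    return fresh, session_state["memory"]

fresh, cached = run_script()
cached.history.append("Human: Hello")

# Second rerun (e.g. the user submits another message)
fresh2, cached2 = run_script()
assert fresh2.history == []                  # module-level memory lost its history
assert cached2.history == ["Human: Hello"]   # session_state memory kept it
```

In the real app the same guard applies to the chain itself: create it inside `if "chain" not in st.session_state:` and call `st.session_state["chain"]` thereafter, so the memory object lives across reruns.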
