Comments (8)

whitead avatar whitead commented on July 28, 2024

Try setting key_filter = False when you run query
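For reference, the suggested call would look roughly like this (a sketch assuming the paper-qa `Docs` API at the time of this thread; the `key_filter` keyword comes from the comment above, so check it against your installed version):

```python
# Hypothetical usage sketch, not verified against a specific paper-qa version.
from paperqa import Docs

docs = Docs()  # add documents first, e.g. docs.add("paper.pdf")
answer = docs.query(
    "What is the main finding?",
    key_filter=False,  # disable the key-filtering step, per the suggestion above
)
print(answer.formatted_answer)
```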

from paper-qa.

maspotts avatar maspotts commented on July 28, 2024

Thanks: tried it but got same error. Is this (specifying a HuggingFaceHub() model) expected to work? Or am I trying something that's not officially supported? I'm not yet familiar enough with langchain to know whether what I'm trying to do makes sense!
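For context, the setup being attempted looks roughly like this (a sketch, not verified against these exact library versions; the model name is the one appearing in the error messages in this thread):

```python
# Sketch of passing a langchain HuggingFaceHub model into paper-qa.
# Constructor arguments may differ between versions.
from langchain import HuggingFaceHub
from paperqa import Docs

llm = HuggingFaceHub(
    repo_id="mosaicml/mpt-7b-chat",
    huggingfacehub_api_token="hf_...",  # your Hugging Face API token
)
docs = Docs(llm=llm)
```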

whitead avatar whitead commented on July 28, 2024

The "error" you're showing is just a warning. Are you sure there is not another exception?

maspotts avatar maspotts commented on July 28, 2024

Ah, good point: I may have jumped the gun! I had assumed something had gone fatally wrong because the process hangs after emitting the warning. I've started it again and left it running; will see if it's completed or errored out or still hanging in the morning...

maspotts avatar maspotts commented on July 28, 2024

OK here's the final exception (actually two exceptions?):

NotImplementedError: Async generation not implemented for this LLM.
ValueError: Error raised by inference API: Model mosaicml/mpt-7b-chat time out

/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/chains/llm.py:110: RuntimeWarning: coroutine 'AsyncRunManager.on_text' was never awaited
  run_manager.on_text(_text, end="\n", verbose=self.verbose)
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
Traceback (most recent call last):
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/paperqa/qaprompts.py", line 91, in agenerate
    return await super().agenerate(input_list, run_manager=run_manager)
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/chains/llm.py", line 90, in agenerate
    return await self.llm.agenerate_prompt(
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/llms/base.py", line 136, in agenerate_prompt
    return await self.agenerate(prompt_strings, stop=stop, callbacks=callbacks)
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/llms/base.py", line 266, in agenerate
    raise e
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/llms/base.py", line 258, in agenerate
    await self._agenerate(
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/llms/base.py", line 395, in _agenerate
    await self._acall(prompt, stop=stop, run_manager=run_manager)
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/llms/base.py", line 363, in _acall
    raise NotImplementedError("Async generation not implemented for this LLM.")
NotImplementedError: Async generation not implemented for this LLM.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/mike/src/chatbot/./chatbot", line 3187, in <module>
    answer = bot.query(args.query)
  File "/Users/mike/src/chatbot/./chatbot", line 1388, in query
    response = index.query(input_text, k = k, max_sources = max_sources)
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/paperqa/docs.py", line 378, in query
    return loop.run_until_complete(
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/asyncio/base_events.py", line 646, in run_until_complete
    return future.result()
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/paperqa/docs.py", line 412, in aquery
    answer = await self.aget_evidence(
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/paperqa/docs.py", line 311, in aget_evidence
    results = await asyncio.gather(*[process(doc) for doc in docs])
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/paperqa/docs.py", line 298, in process
    context=await summary_chain.arun(
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 262, in arun
    return (await self.acall(kwargs, callbacks=callbacks))[self.output_keys[0]]
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 180, in acall
    raise e
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 174, in acall
    await self._acall(inputs, run_manager=run_manager)
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/chains/llm.py", line 195, in _acall
    response = await self.agenerate([inputs], run_manager=run_manager)
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/paperqa/qaprompts.py", line 93, in agenerate
    return self.generate(input_list, run_manager=run_manager)
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/chains/llm.py", line 79, in generate
    return self.llm.generate_prompt(
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/llms/base.py", line 127, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks)
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/llms/base.py", line 199, in generate
    raise e
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/llms/base.py", line 193, in generate
    self._generate(missing_prompts, stop=stop, run_manager=run_manager)
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/llms/base.py", line 377, in _generate
    self._call(prompt, stop=stop, run_manager=run_manager)
  File "/Users/mike/.pyenv/versions/3.10.7/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/langchain/llms/huggingface_hub.py", line 111, in _call
    raise ValueError(f"Error raised by inference API: {response['error']}")
ValueError: Error raised by inference API: Model mosaicml/mpt-7b-chat time out
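The two exceptions chain together: langchain's base LLM class raises `NotImplementedError` from its async path when a model wrapper (like `HuggingFaceHub` at the time) only implements the synchronous `_call`, and paper-qa's `qaprompts.agenerate` catches that and falls back to the synchronous `generate`, which is where the inference-API timeout then surfaces. The fallback pattern can be illustrated with plain Python (class and function names here are illustrative, not the actual library code):

```python
import asyncio

class SyncOnlyLLM:
    """Mimics an LLM wrapper that only implements the sync path."""

    def call(self, prompt: str) -> str:
        return f"answer to: {prompt}"

    async def acall(self, prompt: str) -> str:
        # Mirrors langchain's base-class behaviour for sync-only models.
        raise NotImplementedError("Async generation not implemented for this LLM.")

async def agenerate(llm, prompt: str) -> str:
    try:
        return await llm.acall(prompt)
    except NotImplementedError:
        # Fall back to the blocking sync call, as paperqa's
        # qaprompts.agenerate does in the traceback above.
        return llm.call(prompt)

result = asyncio.run(agenerate(SyncOnlyLLM(), "2+2"))
print(result)  # prints "answer to: 2+2" via the sync fallback
```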

whitead avatar whitead commented on July 28, 2024

Looks like the model timed out @maspotts - the open source models are a lot slower and paper-qa requires a lot of calls. I would recommend Claude or HuggingFace with a GPU instance if you're looking to replace the OpenAI models. Llama can work too, but it's extremely slow
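One generic mitigation when a hosted model is slow is to retry with backoff instead of failing on the first timeout. A minimal stdlib sketch (illustrative only, not part of paper-qa or langchain):

```python
import time

def call_with_retries(fn, retries=3, base_delay=1.0):
    """Retry a flaky zero-argument call with exponential backoff."""
    for attempt in range(retries):
        try:
            return fn()
        except ValueError:  # e.g. "Error raised by inference API: ... time out"
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

# Example: a call that times out twice, then succeeds.
attempts = {"n": 0}

def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ValueError("Model mosaicml/mpt-7b-chat time out")
    return "ok"

result = call_with_retries(flaky, retries=5, base_delay=0.01)
print(result)  # prints "ok" after two retries
```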

maspotts avatar maspotts commented on July 28, 2024

Thanks: (off-topic but) do I need to upgrade to a paid huggingface tier to get sufficient performance?

whitead avatar whitead commented on July 28, 2024

Not sure; I have paid HuggingFace and I'm not sure what's available on the free tier
