promptengineers-ai / llm-server

🤖 Open-source LLM server (OpenAI, Ollama, Groq, Anthropic) with support for HTTP, Streaming, Agents, RAG

Home Page: https://promptengineersai.netlify.app

License: MIT License

Topics: ai, bedrock, fastapi, langchain, large-language-models, openai, python, server, vercel, anthropic, groq, minio, ollama, redis

llm-server's Introduction

🤖 Prompt Engineers AI - LLM Server

Full LLM REST API with prompts, LLMs, Vector Databases, and Agents

๐Ÿ› ๏ธ Setup API Server

```bash
### Change into backend directory
cd backend

### Set up virtual env
python3 -m venv .venv

### Activate virtual env
source .venv/bin/activate

### Install runtime dependencies
pip install -r requirements.txt

### (Optional) Install runtime and dev dependencies together
pip install -r requirements.txt -r requirements-dev.txt

### Migrate database schema
alembic upgrade head

### Seed database users
python3 -m src.seeds.users 3

### Run application on local machine
bash scripts/dev.sh
```

๐Ÿ› ๏ธ Setup Client

```bash
### Change into frontend directory
cd frontend

### Install node_modules
npm install

### Start development server
npm run dev
```

Environment Variables

| Variable Name | Example | Description |
| --- | --- | --- |
| `APP_ENV` | `local` | Environment where the application is running |
| `PINECONE_API_KEY` | | API key for Pinecone services |
| `PINECONE_ENV` | `us-east1-gcp` | Pinecone environment configuration |
| `PINECONE_INDEX` | `default` | Default Pinecone index used |
| `OPENAI_API_KEY` | `sk-abc123...` | Default OpenAI LLM key |
| `GROQ_API_KEY` | | API key for accessing Groq services |
| `ANTHROPIC_API_KEY` | | API key for accessing Anthropic services |
| `MINIO_HOST` | `localhost:9000` | URL of the object storage |
| `BUCKET` | `my-documents` | Name of the MinIO or S3 bucket |
| `ACCESS_KEY_ID` | `AKIAIOSFODNN7EXAMPLE` | IAM user access key ID (optional) |
| `ACCESS_SECRET_KEY` | `wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY` | IAM secret access key (optional) |
| `S3_REGION` | `us-east-1` | Region where the S3 bucket exists (optional) |
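For reference, a minimal environment file might look like the sketch below (assuming the server loads a `.env` file from `backend/`; all values are placeholders, and only the providers you actually use need keys):

```shell
APP_ENV=local
OPENAI_API_KEY=sk-abc123...
PINECONE_ENV=us-east1-gcp
PINECONE_INDEX=default
MINIO_HOST=localhost:9000
BUCKET=my-documents
S3_REGION=us-east-1
```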

Deploy

  1. Log in to Vercel
vercel login
  2. Deploy to Vercel
vercel .

🚀 Roadmap

Here are the upcoming features we're excited to bring to Prompt Engineers AI - LLM Server (more to come):

  • 🛠 UI-Based Tool Configuration
  • 🖥 Code Interpreter
  • 🤖 Assistant Creation Capability

Create an issue and let's start a discussion if you'd like to see a feature added to the roadmap.

๐Ÿค How to Contribute

We welcome contributions from the community, from beginners to seasoned developers. Here's how you can contribute:

  1. Fork the repository: Click on the 'Fork' button at the top right corner of the repository page on GitHub.

  2. Clone the forked repository to your local machine: git clone <forked_repo_link>.

  3. Navigate to the project folder: cd llm-server.

  4. Create a new branch for your changes: git checkout -b <branch_name>.

  5. Make your changes in the new branch.

  6. Commit your changes: git commit -am 'Add some feature'.

  7. Push to the branch: git push origin <branch_name>.

  8. Open a Pull Request: Go back to your forked repository on GitHub and click on 'Compare & pull request' to create a new pull request.

Please ensure that your code passes all tests and, if possible, add tests for new features. Always write a clear and concise commit message and pull request description.

💡 Issues

Feel free to submit issues and enhancement requests. We're always looking for feedback and suggestions.

🤓 Maintainers

📜 License

This project is open-source, under the MIT License. Feel free to use, modify, and distribute the code as you please.

Happy Prompting! 🎉🎉

llm-server's People

Contributors: ryaneggz

llm-server's Issues

Groq RAG is not working

ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 265, in __call__
    await wrap(partial(self.listen_for_disconnect, receive))
  File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 261, in wrap
    await func()
  File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 238, in listen_for_disconnect
    message = await receive()
  File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 568, in receive
    await self.message_event.wait()
  File "/usr/lib/python3.10/asyncio/locks.py", line 214, in wait
    await fut
asyncio.exceptions.CancelledError: Cancelled by cancel scope 7f9ca4604a00

During handling of the above exception, another exception occurred:

  + Exception Group Traceback (most recent call last):
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 411, in run_asgi
  |     result = await app(  # type: ignore[func-returns-value]
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
  |     return await self.app(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
  |     await super().__call__(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
  |     raise exc
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
  |     await self.app(scope, receive, _send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 93, in __call__
  |     await self.simple_response(scope, receive, send, request_headers=headers)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 148, in simple_response
  |     await self.app(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
  |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
  |     raise exc
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
  |     await route.handle(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
  |     await self.app(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
  |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
  |     raise exc
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 75, in app
  |     await response(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 258, in __call__
  |     async with anyio.create_task_group() as task_group:
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 678, in __aexit__
  |     raise BaseExceptionGroup(
  | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 261, in wrap
    |     await func()
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 250, in stream_response
    |     async for chunk in self.body_iterator:
    |   File "/home/ryaneggz/promptengineers/llm-server/backend/src/utils/__init__.py", line 9, in chain_stream
    |     async for event in runnable:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4712, in astream
    |     async for item in self.bound.astream(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4712, in astream
    |     async for item in self.bound.astream(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2900, in astream
    |     async for chunk in self.atransform(input_aiter(), config, **kwargs):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2883, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
    |     async for output in final_pipeline:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4748, in atransform
    |     async for item in self.bound.atransform(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2883, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
    |     async for output in final_pipeline:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 601, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 580, in _atransform
    |     async for chunk in for_passthrough:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 97, in tee_peer
    |     item = await iterator.__anext__()
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 97, in tee_peer
    |     item = await iterator.__anext__()
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 601, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 591, in _atransform
    |     yield await first_map_chunk_task
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 62, in anext_impl
    |     return await __anext__(iterator)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3315, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3302, in _atransform
    |     chunk = AddableDict({step_name: task.result()})
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3285, in get_next_chunk
    |     return await py_anext(generator)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4748, in atransform
    |     async for item in self.bound.atransform(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1334, in atransform
    |     async for output in self.astream(final, config, **kwargs):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/branch.py", line 400, in astream
    |     async for chunk in self.default.astream(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2900, in astream
    |     async for chunk in self.atransform(input_aiter(), config, **kwargs):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2883, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
    |     async for output in final_pipeline:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1316, in atransform
    |     async for ichunk in input:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/output_parsers/transform.py", line 60, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1944, in _atransform_stream_with_config
    |     final_input: Optional[Input] = await py_anext(input_for_tracing, None)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 62, in anext_impl
    |     return await __anext__(iterator)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 97, in tee_peer
    |     item = await iterator.__anext__()
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1334, in atransform
    |     async for output in self.astream(final, config, **kwargs):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 319, in astream
    |     raise e
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 297, in astream
    |     async for chunk in self._astream(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_groq/chat_models.py", line 384, in _astream
    |     async for chunk in await self.async_client.create(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/groq/resources/chat/completions.py", line 357, in create
    |     return await self._post(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/groq/_base_client.py", line 1711, in post
    |     return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/groq/_base_client.py", line 1427, in request
    |     return await self._request(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/groq/_base_client.py", line 1518, in _request
    |     raise self._make_status_error_from_response(err.response) from None
    | groq.BadRequestError: Error code: 400 - {'error': {'message': "'messages.1' : for 'role:system' the following must be satisfied[('messages.1.content' : value must be a string)]", 'type': 'invalid_request_error'}}
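The 400 above indicates Groq rejected a system message whose `content` was not a plain string — RAG prompt assembly can produce list-of-parts content. One possible workaround is to flatten structured content before the request; the sketch below uses plain `{"role", "content"}` dicts as a simplification of LangChain's message objects, and `coerce_content_to_str` is a hypothetical helper name:

```python
def coerce_content_to_str(messages):
    """Flatten list-style message content into plain strings.

    Groq's API requires message content to be a string; when a RAG
    prompt builds content as a list of parts, join the text parts.
    """
    fixed = []
    for m in messages:
        content = m["content"]
        if isinstance(content, list):
            # Join text parts; fall back to str() for non-dict parts.
            content = "".join(
                part.get("text", "") if isinstance(part, dict) else str(part)
                for part in content
            )
        fixed.append({**m, "content": content})
    return fixed
```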

Anthropic RAG (Sonnet & Opus) not working. ValueError: System message must be at beginning of message list.

ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 265, in __call__
    await wrap(partial(self.listen_for_disconnect, receive))
  File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 261, in wrap
    await func()
  File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 238, in listen_for_disconnect
    message = await receive()
  File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 568, in receive
    await self.message_event.wait()
  File "/usr/lib/python3.10/asyncio/locks.py", line 214, in wait
    await fut
asyncio.exceptions.CancelledError: Cancelled by cancel scope 7f0dc4518940

During handling of the above exception, another exception occurred:

  + Exception Group Traceback (most recent call last):
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 411, in run_asgi
  |     result = await app(  # type: ignore[func-returns-value]
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
  |     return await self.app(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
  |     await super().__call__(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
  |     raise exc
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
  |     await self.app(scope, receive, _send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 93, in __call__
  |     await self.simple_response(scope, receive, send, request_headers=headers)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 148, in simple_response
  |     await self.app(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
  |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
  |     raise exc
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
  |     await route.handle(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
  |     await self.app(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
  |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
  |     raise exc
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/routing.py", line 75, in app
  |     await response(scope, receive, send)
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 258, in __call__
  |     async with anyio.create_task_group() as task_group:
  |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 678, in __aexit__
  |     raise BaseExceptionGroup(
  | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 261, in wrap
    |     await func()
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/starlette/responses.py", line 250, in stream_response
    |     async for chunk in self.body_iterator:
    |   File "/home/ryaneggz/promptengineers/llm-server/backend/src/utils/__init__.py", line 9, in chain_stream
    |     async for event in runnable:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4712, in astream
    |     async for item in self.bound.astream(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4712, in astream
    |     async for item in self.bound.astream(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2900, in astream
    |     async for chunk in self.atransform(input_aiter(), config, **kwargs):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2883, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
    |     async for output in final_pipeline:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4748, in atransform
    |     async for item in self.bound.atransform(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2883, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
    |     async for output in final_pipeline:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 601, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 580, in _atransform
    |     async for chunk in for_passthrough:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 97, in tee_peer
    |     item = await iterator.__anext__()
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 97, in tee_peer
    |     item = await iterator.__anext__()
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 601, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/passthrough.py", line 591, in _atransform
    |     yield await first_map_chunk_task
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 62, in anext_impl
    |     return await __anext__(iterator)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3315, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3302, in _atransform
    |     chunk = AddableDict({step_name: task.result()})
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3285, in get_next_chunk
    |     return await py_anext(generator)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4748, in atransform
    |     async for item in self.bound.atransform(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1334, in atransform
    |     async for output in self.astream(final, config, **kwargs):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/branch.py", line 400, in astream
    |     async for chunk in self.default.astream(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2900, in astream
    |     async for chunk in self.atransform(input_aiter(), config, **kwargs):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2883, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1985, in _atransform_stream_with_config
    |     chunk = cast(Output, await py_anext(iterator))
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2853, in _atransform
    |     async for output in final_pipeline:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1316, in atransform
    |     async for ichunk in input:
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/output_parsers/transform.py", line 60, in atransform
    |     async for chunk in self._atransform_stream_with_config(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1944, in _atransform_stream_with_config
    |     final_input: Optional[Input] = await py_anext(input_for_tracing, None)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 62, in anext_impl
    |     return await __anext__(iterator)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/utils/aiter.py", line 97, in tee_peer
    |     item = await iterator.__anext__()
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1334, in atransform
    |     async for output in self.astream(final, config, **kwargs):
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 319, in astream
    |     raise e
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 297, in astream
    |     async for chunk in self._astream(
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_anthropic/chat_models.py", line 440, in _astream
    |     params = self._format_params(messages=messages, stop=stop, **kwargs)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_anthropic/chat_models.py", line 378, in _format_params
    |     system, formatted_messages = _format_messages(messages)
    |   File "/home/ryaneggz/pxt/llm-server/backend/.venv/lib/python3.10/site-packages/langchain_anthropic/chat_models.py", line 152, in _format_messages
    |     raise ValueError("System message must be at beginning of message list.")
    | ValueError: System message must be at beginning of message list.
    +------------------------------------
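Here `langchain_anthropic` raises because a system message appears mid-history, which RAG prompt assembly can produce. A possible workaround is to reorder the history before invoking the model; the sketch below uses plain `{"role", "content"}` dicts as a simplification of LangChain's message objects, and `ensure_system_first` is a hypothetical helper name:

```python
def ensure_system_first(messages):
    """Move any system messages to the front of the chat history.

    Anthropic requires the system prompt to lead the message list;
    this preserves the relative order of all other messages.
    """
    system = [m for m in messages if m["role"] == "system"]
    others = [m for m in messages if m["role"] != "system"]
    return system + others
```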

Wire LLaMA to VectorStore Endpoints

We need to establish a connection between LLaMA and the VectorStore endpoints to enable seamless integration. This will allow for efficient querying and retrieval of vector data.
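The intended wiring can be sketched roughly as follows — retrieve from the vector store, then feed the hits to the model as context. All names here (`VectorStoreClient`, `rag_answer`) are hypothetical stand-ins for the real endpoints, and naive keyword overlap stands in for real embedding similarity:

```python
class VectorStoreClient:
    """Stand-in for the server's vector store endpoints (hypothetical)."""

    def __init__(self, documents):
        self.documents = documents

    def search(self, query, k=2):
        # Naive keyword-overlap scoring in place of embedding similarity.
        words = query.lower().replace("?", "").split()
        scored = sorted(
            self.documents,
            key=lambda doc: sum(w in doc.lower() for w in words),
            reverse=True,
        )
        return scored[:k]


def rag_answer(llm, store, question):
    """Retrieve context from the store, then prompt the LLM with it."""
    context = "\n".join(store.search(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return llm(prompt)
```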
