Comments (16)
Wait....
IT WORKS ... WHEN I REMOVE THE STREAM OPTION FROM THE MODEL
from dialoqbase.
Would it be possible to call the ollama embedding endpoint?
https://github.com/ollama/ollama/blob/main/docs/api.md#generate-embeddings
This would allow for the offloading to a remote ollama node running on a GPU.
Thanks!
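The documented /api/embeddings endpoint can already target a remote node. A minimal sketch in Python of what such a call looks like (the `gpu-node` host and `llama2` model are placeholders, not dialoqbase settings):

```python
import json
import urllib.request

# Placeholder for the remote GPU node running Ollama; adjust to your setup.
OLLAMA_URL = "http://gpu-node:11434"

def build_embedding_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/embeddings endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/embeddings",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With a live server you would then do:
# with urllib.request.urlopen(build_embedding_request("llama2", "hello")) as r:
#     vector = json.load(r)["embedding"]  # a list of floats
```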
I don't understand ... there is always a new error from somewhere @_@
In the docker console:
Error: Request to Ollama server failed: 404 Not Found
at OllamaEmbeddings._request (/app/node_modules/@langchain/community/dist/embeddings/ollama.cjs:101:19)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async RetryOperation._fn (/app/node_modules/p-retry/index.js:50:12) {
attemptNumber: 7,
retriesLeft: 0
}
It seems to me that the web interface is somewhat independent from the config and the Docker backend.
The Web UI now accepts the model and embedding settings, but they don't work on the backend (Docker).
Example of what I mean: in the web config, only the address "http://host.docker.internal:11434" is accepted, but that doesn't work when the Docker image tries to reach my Ollama (I forgot to mention it runs on WSL, not in Docker).
So all in all, it seems the localhost networking conflicts with the code. LiteLLM behaves the same way as Dialoqbase.
Somehow you (and LiteLLM) refer to the local host in a specific way, because dozens of other projects work with my current setup.
One of them is Ollama Web UI: running in Docker, it finds and works with Ollama at "http://host.docker.internal:11434".
Another tip: my host IPs are a complicated story... I'm not sure whether they come from WSL or from Docker; maybe that is what confuses Dialoqbase.
Now I am trying to figure out the Docker network...
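One known wrinkle with host.docker.internal: Docker Desktop (Windows/macOS, including WSL backends) injects that hostname automatically, but a plain Linux Docker engine does not define it unless you map it yourself. A minimal docker-compose fragment, assuming the service is named `dialoqbase` (the name is a guess) and Docker 20.10+:

```yaml
services:
  dialoqbase:
    extra_hosts:
      # Resolve host.docker.internal to the host's gateway IP inside the container
      - "host.docker.internal:host-gateway"
```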
IDK
ollama version is 0.1.20
Hey @mjtechguy, I have released the bug fixes that will resolve your issues. :)
Hey, the error is because of the Ollama embedding you selected, which is deprecated. You can add Ollama embedding models the same way you added the Ollama chat model by going to 'Models' and selecting 'Embedding'.
Hey, the error is because of the Ollama embedding you selected, which is deprecated. You can add Ollama embedding models the same way you added the Ollama chat model by going to 'Models' and selecting 'Embedding'.
And it seems to get past that problem.
But then we have a NEW message :)
That means it doesn't update the config...
I tried manually, but I don't know how it will work:
Gives the same NEW error.
PLEASE HELP!
BTW it doesn't work with localhost or 127.0.0.1... it only finds the Ollama server with host.docker.internal. Maybe that is related, IDK.
Gives the same NEW error.
Hey, sorry about the error. What error is it throwing now?
In the docker console:
Interesting. You added the Ollama chat model via the editor, but you are still experiencing the issue, right?
If you successfully added the Ollama model, then there is no problem with the chat model. The issue is caused by the embedding model you configured. Can you double-check the model ID?
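One quick way to double-check a model ID is Ollama's GET /api/tags endpoint, which lists every model the server actually has. A small sketch of parsing its response (the `sample` payload is trimmed to the `name` field; the full response carries more metadata):

```python
def extract_model_ids(tags_response: dict) -> list:
    """Pull model IDs out of an Ollama GET /api/tags response body."""
    return [m["name"] for m in tags_response.get("models", [])]

# Against a live server you would fetch the JSON first, e.g.:
# import json, urllib.request
# with urllib.request.urlopen("http://host.docker.internal:11434/api/tags") as r:
#     print(extract_model_ids(json.load(r)))

# Trimmed example of the response shape:
sample = {"models": [{"name": "llama2:latest"}, {"name": "zephyr:7b-beta"}]}
print(extract_model_ids(sample))  # -> ['llama2:latest', 'zephyr:7b-beta']
```

If the embedding model ID you configured is not in this list, the 404 from the server is expected.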
Hey, I think I found the issue; it's not related to Docker (I just tested it on my Windows WSL, too).
Can you try using zephyr:7b-beta as the chat model along with any other embedding model (use Xenova/all-MiniLM-L6-v2) and send a message? It should work.
There will be a new release in 2 or 3 hours that fixes the above issues (hopefully).
Also, is http://host.docker.internal:11434 working perfectly?
http://host.docker.internal:11434/ is the only way that HALF works for me... I don't know if that will be for everyone :D
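One hypothesis worth ruling out for the "half works" behavior is the trailing slash: joining "http://host.docker.internal:11434/" with "/api/embeddings" produces a double slash, which some routers reject with 404. A tiny defensive helper, as a sketch (this is not dialoqbase's actual code):

```python
def normalize_ollama_url(url: str) -> str:
    """Strip trailing slashes so '<base>' + '/api/...' joins cleanly."""
    return url.rstrip("/")

print(normalize_ollama_url("http://host.docker.internal:11434/"))
# -> http://host.docker.internal:11434
```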
That's a strange bug. I will look into it. Which version of Ollama are you using?
Also, do not remove {question} from the prompt.
I tested Ollama mainly on Llama2 and Mistral; both of them work fine for me. I released a few Ollama-related fixes; you can pull the latest in 1 or 2 hours.
Try another model you have locally to see if it's working fine. :)
Yes, you can configure Ollama embeddings just like chat models by following these steps:
- Go to admin models.
- Select 'Embedding'.
- Add the Ollama URL and choose the model you want to generate embeddings with. (The UI will be enhanced in a coming version.)
Please let me know if you encounter any issues.