Comments (6)
Hello, I can't find anywhere in the code where that option property
is passed to the server. Are you sure it isn't coming from something else?
from twinny.
Hmm, I'm not really sure where else it would be coming from. At the time I collected these logs, nothing else should have been hitting the Ollama API. Since it seems to be working in most cases, I think we can close this out. Thanks for looking into it.
Running into a similar issue:
[GIN] 2024/06/01 - 14:47:55 | 200 | 639.312645ms | 172.17.0.1 | POST "/v1/chat/completions"
time=2024-06-01T14:47:59.879Z level=WARN source=types.go:384 msg="invalid option provided" option=""
I get the same error from a manual POST to /api/generate using the Python requests library. I pass options as a dictionary. Weird.
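For reference, a minimal sketch of such a request (stdlib only, so it can be pasted anywhere; the model name and prompt are placeholders, and the endpoint/option names follow Ollama's documented API):

```python
import json
import urllib.request

# Minimal non-streaming request to Ollama's /api/generate endpoint.
# "temperature" and "num_predict" are documented Ollama options; an
# unrecognized key inside "options" is what triggers the
# "invalid option provided" warning in the server log.
payload = {
    "model": "llama3",                # placeholder model name
    "prompt": "Why is the sky blue?", # placeholder prompt
    "stream": False,
    "options": {
        "temperature": 0.7,
        "num_predict": 128,
    },
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# response = urllib.request.urlopen(req)  # requires a running Ollama server
```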
Has anyone figured this out? Happens to me too. This is the dictionary that's sent:
data = {
    "model": model,
    "messages": [{
        "role": "user",
        "content": message
    }],
    "stream": False,
    "options": {
        "max_tokens": max_tokens,
        "temperature": temperature
    }
}
Both max_tokens and temperature are never None.
This is the log entry:
time=2024-06-25T20:13:55.877Z level=WARN source=types.go:430 msg="invalid option provided" option=""
[GIN] 2024/06/25 - 20:13:56 | 400 | 130.044154ms | 172.16.16.35 | POST "/api/chat"
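One possible lead (not confirmed as the cause of the warning above): max_tokens is an OpenAI-style parameter and is not among Ollama's documented options for the native /api/chat endpoint; Ollama's token-limit option is num_predict. A sketch of the same payload with the native key, using hypothetical values:

```python
import json

# Hypothetical values standing in for the variables in the comment above.
model, message, max_tokens, temperature = "llama3", "hello", 128, 0.7

# OpenAI-style key that Ollama's native options validation does not recognize:
openai_style_options = {"max_tokens": max_tokens, "temperature": temperature}

# Ollama's documented equivalent of max_tokens is num_predict:
native_options = {"num_predict": max_tokens, "temperature": temperature}

data = {
    "model": model,
    "messages": [{"role": "user", "content": message}],
    "stream": False,
    "options": native_options,
}
print(json.dumps(data, indent=2))
```

If the warning disappears after renaming the key, that would point at options validation rather than the HTTP layer.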