dexaai / dexter

LLM tools used in production at Dexa

Home Page: https://dexter.dexa.ai
License: MIT License
The chunks for tool calls aren't properly managed by `handleUpdate()` in `ChatModel`.
When an AIRunner encounters an error, it swallows it and retries until it hits the max iterations, then returns an error stating that the max iterations were hit, without ever exposing the underlying errors.
I had the following code in my codebase:

```ts
return createProxyForPromiseMethods(async () => ({
  run: createAIRunner({
    chatModel: await chatModel({
      model: options.model ?? "gpt-3.5-turbo",
      response_format: { type: "json_object" },
      tools: allTools.map((t) => ({
        type: "function",
        function: t.spec,
      })),
      tool_choice: match(options.tools)
        .ONLY((fn): Model.Chat.Config["tool_choice"] | undefined => ({
          type: "function" as const,
          function: { name: fn.spec.name },
        }))
        ._(() => undefined),
    }),
    functions: allTools,
    functionCallConcurrency: options.functionCallConcurrency,
    mode: "tools",
  }),
}));
```
Unfortunately, the way in which dexter merges the options from `ChatModel` with the AIRunner leads to both `functions` and `tools` being specified, which I believe is why OpenAI ends up being sent duplicate function specs for code like the above.
I suspect this happens near `chatModel.run`, in the call to `const mergedParams = deepMerge(this.params, params);`, which ends up duplicating the entries in the list of functions. Sometimes I also get an error indicating several functions in this list have no names (`name = ""`, `description = ""`). I wonder if it is related...
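The suspected failure mode can be demonstrated in isolation. This is a hypothetical sketch (`naiveDeepMerge` is not dexter's actual helper): a deep merge that concatenates arrays will duplicate the tool specs whenever the same list appears in both the model's stored params and the per-call params.

```typescript
// Hypothetical sketch of the suspected failure mode: a deep merge that
// concatenates arrays duplicates tool/function specs when the same list
// appears in both `this.params` and the per-call `params`.
type Params = { tools?: { name: string }[] };

function naiveDeepMerge(a: Params, b: Params): Params {
  // Arrays are concatenated rather than replaced.
  const tools = [...(a.tools ?? []), ...(b.tools ?? [])];
  return { ...a, ...b, tools };
}

const base: Params = { tools: [{ name: 'search' }] };
const perCall: Params = { tools: [{ name: 'search' }] };

const merged = naiveDeepMerge(base, perCall);
console.log(merged.tools);
// → [ { name: 'search' }, { name: 'search' } ]  (doubled spec)
```

Replacing arrays instead of concatenating them during the merge would avoid the duplication.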
Trying to use dexter with non-OpenAI LLMs hosted by OpenRouter and seeing a few issues.

This logs for every LLM call. Since there's no real standard here aside from OpenAI, it'd be nice to support this `total_cost` format as an optional parameter that we fall back to internally, skipping the `console.warn`.

Will add more cases to this issue as I come across them.
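A minimal sketch of what that fallback could look like, assuming a simplified `Usage` shape (`resolveCost` and its signature are hypothetical, not dexter's actual API):

```typescript
// Hypothetical sketch: prefer OpenRouter's non-standard `total_cost`
// usage field when present, otherwise fall back to a caller-supplied
// estimate, and only warn when neither is available.
interface Usage {
  prompt_tokens: number;
  completion_tokens: number;
  total_cost?: number; // OpenRouter extension; absent from OpenAI responses
}

function resolveCost(
  usage: Usage,
  estimate?: (u: Usage) => number
): number | undefined {
  if (typeof usage.total_cost === 'number') return usage.total_cost;
  if (estimate) return estimate(usage);
  console.warn('No cost information available for this model');
  return undefined;
}
```

With this shape, the warning only fires when there is genuinely no way to determine cost.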
Using `"moduleResolution": "node"` in the consuming project, which is very common, seems to break dexter's typing exports, as tsserver can't find them.

Example:

`import { type Prompt } from '@dexaai/dexter/prompt' // this gives an error`

This will probably surprise anyone who tries to use this package, since before 2 weeks ago I had never used `"moduleResolution": "node"`.
It would be nice to use `tsc` only, but I wonder if we'd be better off using something like `tsup` for output compatibility?
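For background: the classic `"moduleResolution": "node"` mode ignores the `"exports"` map entirely, so subpath types need a fallback such as `"typesVersions"`. A sketch of a dual-resolution `package.json` fragment; the field names follow Node's and TypeScript's documented conventions, but the concrete dist paths here are assumptions, not dexter's actual layout:

```json
{
  "exports": {
    "./prompt": {
      "types": "./dist/prompt/index.d.ts",
      "import": "./dist/prompt/index.js"
    }
  },
  "typesVersions": {
    "*": {
      "prompt": ["./dist/prompt/index.d.ts"]
    }
  }
}
```

Modern resolvers (`node16`, `bundler`) use `"exports"`, while legacy `"node"` resolution falls back to `"typesVersions"` to locate the subpath declarations.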
Hi there,
Congrats on the launch! The library looks very clean, and I like the interfaces. I would love to see some alternative stores implemented, and would be happy to contribute some myself.
Is there anything I/we (other contributors) should be aware of when thinking about submitting a PR like this?
Thanks again!
When a function call's arguments fail the Zod validation, we add a user message with the validation error and retry. This doesn't work with tool calling and throws:
An assistant message with 'tool_calls' must be followed by tool messages responding to each 'tool_call_id'.
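To satisfy that API constraint, the retry would need to answer each `tool_call_id` with a `tool` role message instead of a `user` message. A hypothetical sketch of that shape (`validationErrorMessages` is an illustrative helper, not dexter's API):

```typescript
// Hypothetical sketch: when tool-call arguments fail validation, respond
// to each tool_call_id with a `tool` role message carrying the error, so
// the OpenAI API's assistant/tool message pairing requirement is met.
type ToolCall = {
  id: string;
  function: { name: string; arguments: string };
};

function validationErrorMessages(toolCalls: ToolCall[], error: string) {
  return toolCalls.map((tc) => ({
    role: 'tool' as const,
    tool_call_id: tc.id, // must match the assistant message's tool_calls
    content: `Invalid arguments for ${tc.function.name}: ${error}. Please try again.`,
  }));
}
```

These messages would be appended after the assistant message containing `tool_calls`, and the retry request sent with the extended history.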
Hey, do you plan to provide support for other data stores? A free one that could be self-hosted would be nice.
Currently sitting at ~17MB, with 14MB coming from tiktoken: https://pkg-size.dev/@dexaai%2Fdexter

See also dqbd/tiktoken#68
For comparison, here's langchain at ~36MB: https://pkg-size.dev/langchain, but we should be a lot slimmer than this. Langchain's not even loading the full tiktoken WASM lib; they're using the 6.6MB js-tiktoken.
This issue may end up being resolved just by improving tiktoken's WASM bundle size upstream, but I wanted to track it while it's top of mind for gptlint.
@rileytomasek is there a reason it's still on v1? Happy to create a PR.
Certain functions like `createAIFunction` are mangled in the production docs by nextra. I haven't been able to repro this issue locally.

For example, https://dexter.dexa.ai/docs/functions/createAIFunction gets incorrectly resolved as https://dexter.dexa.ai/docs/functions/createAiFunction, leading to 404s. (Note the difference in casing :sigh:)
This continues the work in #17 to make Dexter's core more immutable and easier to reason about.
Specifically, any of the objects passed to event handlers should be readonly to guarantee that event handlers don't have unintended side effects for future event handlers or internal functionality.
We may want to consider making other key objects, like `AbstractModel.params`, deeply readonly as well.
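A minimal sketch of what a deep readonly type could look like, assuming no special handling is needed for functions, Maps, or Sets in the params objects (this is an illustration, not dexter's actual type):

```typescript
// Recursive readonly: arrays become readonly arrays, objects get all
// properties marked readonly, primitives pass through unchanged.
type DeepReadonly<T> = T extends (infer U)[]
  ? readonly DeepReadonly<U>[]
  : T extends object
    ? { readonly [K in keyof T]: DeepReadonly<T[K]> }
    : T;

// Example: nested mutation is rejected at compile time.
const params: DeepReadonly<{ model: string; tools: { name: string }[] }> = {
  model: 'gpt-4',
  tools: [{ name: 'search' }],
};
// params.tools[0].name = 'other'; // compile error: readonly property
```

Note this is a compile-time guarantee only; pairing it with `Object.freeze` would enforce immutability at runtime too.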
We should make Sentry an optional dependency and allow it to be passed into the model constructor as @brianbegy suggested here.
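One possible shape for that, as a hypothetical sketch (the `Telemetry` interface and `ChatModelSketch` class are illustrative, not dexter's actual constructor API):

```typescript
// Hypothetical sketch: inject an optional Sentry-like client through the
// constructor instead of hard-depending on @sentry/node.
interface Telemetry {
  captureException(err: unknown): void;
}

class ChatModelSketch {
  constructor(private telemetry?: Telemetry) {}

  async run(fn: () => Promise<string>): Promise<string> {
    try {
      return await fn();
    } catch (err) {
      this.telemetry?.captureException(err); // no-op when not provided
      throw err;
    }
  }
}
```

Callers who want Sentry would pass their own configured client; everyone else avoids the dependency entirely.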
I am using an openopenAI project in which I apply a tool call in two variations:

In both cases, the same problem arises: the toolCall object does not contain `arguments` in `function`. Consequently, when calling file access, an error occurs when attempting to parse the JSON. And when calling image generation, the arguments should include a prompt, which is also not delivered.
Reproduction steps:
Create a function to retrieve information from a file and access it. As a result, an error occurs:
JSONError: Unexpected end of JSON input while parsing empty string
This is related to the fact that the function.arguments object is empty.
```
toolCall: {
  index: 0,
  id: 'call_UEn7arSct776zKLJVyMY4o7Q',
  type: 'function',
  function: { name: 'retrieval', arguments: '' }
}
```
Create a function for image generation and access it. As a result, an error occurs:
JSONError: Unexpected end of JSON input while parsing empty string
Because the same `arguments` object will be empty.

After investigating, I discovered that dexter is not returning the arguments for function calls. It's possible that something has changed on OpenAI's end in recent weeks, making this functionality unavailable now?
What should I do? Where should I look?
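One place to look: in streaming responses, tool-call arguments arrive as incremental deltas (the `index: 0` in the dump above is characteristic of a stream chunk). If the fragments are not concatenated per index, the final `function.arguments` stays an empty string, which would produce exactly the JSONError above. A hypothetical sketch of the accumulation logic, with simplified types that are assumptions for illustration:

```typescript
// Hypothetical sketch: accumulate streamed tool-call deltas by index.
// Each chunk may carry only a fragment of the JSON arguments string.
type ToolCallDelta = {
  index: number;
  id?: string;
  function?: { name?: string; arguments?: string };
};

function accumulateToolCalls(chunks: ToolCallDelta[]) {
  const calls: { id: string; name: string; arguments: string }[] = [];
  for (const delta of chunks) {
    if (!calls[delta.index]) {
      calls[delta.index] = { id: '', name: '', arguments: '' };
    }
    const call = calls[delta.index];
    if (delta.id) call.id = delta.id;
    if (delta.function?.name) call.name += delta.function.name;
    // Argument fragments must be concatenated, not overwritten.
    if (delta.function?.arguments) call.arguments += delta.function.arguments;
  }
  return calls;
}
```

If dexter's stream handling drops or overwrites these fragments, the accumulated `arguments` would come back empty exactly as reported.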