
wkok / openai-clojure

Clojure functions to drive the OpenAI API

Home Page: https://cljdoc.org/d/net.clojars.wkok/openai-clojure

License: MIT License

Topics: clojure, gpt-3, openai, chatgpt

openai-clojure's Introduction

openai-clojure (Unofficial)


Clojure functions to drive the OpenAI API and Azure OpenAI API

This unofficial library aims to hide the small differences between the two APIs, making it possible to develop tools and applications that work with both variants.

Documentation


Supported APIs

|                    | OpenAI  | Azure OpenAI |
|--------------------|---------|--------------|
| Version            | v2.3.0  | v2024-06-01  |
| Chat               | X       | X            |
| Audio              | X       |              |
| Completion         | X       | X            |
| Embeddings         | X       | X            |
| Models             | X       |              |
| Images             | X       |              |
| Files              | X       |              |
| Fine-tuning        | X       |              |
| Moderations        | X       |              |
| Assistants (beta)  | X       |              |
| Threads (beta)     | X       |              |
| Messages (beta)    | X       |              |
| Runs (beta)        | X       |              |

Configuration


Add the openai-clojure dependency

deps.edn

net.clojars.wkok/openai-clojure {:mvn/version "0.19.0"}

Leiningen project.clj

[net.clojars.wkok/openai-clojure "0.19.0"]

Java

Java 11 or later is required.

Authentication

OpenAI

API Key

Set the environment variable OPENAI_API_KEY to your OpenAI API key.
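For example, in a POSIX shell (the key value below is a placeholder, not a real key):

```shell
# Placeholder value; substitute your real OpenAI API key.
export OPENAI_API_KEY="sk-your-key-here"
```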

(For alternative options to pass the API Key see options)

An API key can be generated in your OpenAI account
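One such alternative is passing the key per call in the options map that the api functions accept as a second argument, as shown in examples later in this document. A sketch (the `ask` function name and the key handling are illustrative, not part of the library):

```clojure
(require '[wkok.openai-clojure.api :as api])

(defn ask
  "Sketch: send a single user question, passing the API key explicitly
   instead of relying on the OPENAI_API_KEY environment variable."
  [api-key question]
  (api/create-chat-completion
   {:model "gpt-3.5-turbo"
    :messages [{:role "user" :content question}]}
   {:api-key api-key}))
```

This performs a network call to the OpenAI API, so a valid key is required to run it.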

Organization

Optional - If your OpenAI account uses multiple organizations, set the environment variable OPENAI_ORGANIZATION to the one used for your app.

Azure OpenAI

See: Authentication - Azure OpenAI

Quickstart

See the full API Reference for examples of all the supported OpenAI APIs.

Require the api namespace

(:require [wkok.openai-clojure.api :as api])

A simple chat conversation with OpenAI's ChatGPT could be:

(api/create-chat-completion {:model "gpt-3.5-turbo"
                             :messages [{:role "system" :content "You are a helpful assistant."}
                                        {:role "user" :content "Who won the world series in 2020?"}
                                        {:role "assistant" :content "The Los Angeles Dodgers won the World Series in 2020."}
                                        {:role "user" :content "Where was it played?"}]})

Result:

{:id "chatcmpl-6srOKLabYTpTRwRUQxjkcBxw3uf1H",
 :object "chat.completion",
 :created 1678532968,
 :model "gpt-3.5-turbo-0301",
 :usage {:prompt_tokens 56, :completion_tokens 19, :total_tokens 75},
 :choices
 [{:message
   {:role "assistant",
    :content
    "The 2020 World Series was played at Globe Life Field in Arlington, Texas."},
   :finish_reason "stop",
   :index 0}]}
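Since the result is a plain Clojure map, the assistant's reply can be extracted with `get-in`. A small illustrative snippet using the response shape above (trimmed to the relevant keys):

```clojure
;; A response map as returned by create-chat-completion (abbreviated).
(def response
  {:choices [{:message {:role "assistant"
                        :content "The 2020 World Series was played at Globe Life Field in Arlington, Texas."}
              :finish_reason "stop"
              :index 0}]})

;; Navigate to the first choice's message content.
(get-in response [:choices 0 :message :content])
;; => "The 2020 World Series was played at Globe Life Field in Arlington, Texas."
```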

Issues and features

Please feel free to raise issues on GitHub or send pull requests.

Acknowledgements

This library uses Martian, an HTTP abstraction library.

License

This is an unofficial library; it is not affiliated with nor endorsed by OpenAI.

MIT License

Copyright (c) 2023 Werner Kok

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

openai-clojure's People

Contributors

behrica, damesek, jianghoy, john-shaffer, liquidz, logankilpatrick, wkok, zmedelis


openai-clojure's Issues

Azure API deprecation

As stated in https://learn.microsoft.com/en-us/azure/ai-services/openai/api-version-deprecation, Microsoft will deprecate certain versions of the OpenAI inference API.

I was able to "patch" this for my use with a little hack, but you might want to update to

https://github.com/Azure/azure-rest-api-specs/blob/main/specification/cognitiveservices/data-plane/AzureOpenAI/inference/preview/2023-12-01-preview/inference.json

The hack (for others to use, with the inference file in your resources folder as AzureOpenAI/2023-12-01-preview.json) is

(ns azure.openai.patch)


(in-ns 'wkok.openai-clojure.azure)


(defn load-openai-spec []
  (json/decode (slurp (io/resource "AzureOpenAI/2023-12-01-preview.json")) keyword))

(def m
  (delay
    (patch-handler
      (martian/bootstrap-openapi "/openai"
                                 (load-openai-spec)
                                 (update
                                   martian-http/default-opts
                                   :interceptors
                                   #(-> (remove (comp #{martian-http/perform-request}) %)
                                        (concat [add-authentication-header
                                                 openai-interceptors/set-request-options
                                                 override-api-endpoint
                                                 sse/perform-sse-capable-request])))))))

(defn patch-params [params]
  {:api-version       "2023-12-01-preview"
   :deployment-id     (:model params)
   :martian.core/body (dissoc params :model)})

and then requiring azure.openai.patch at the right point in your code ;)
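For example, near your application's entry point (the `my.app` namespace is a hypothetical placeholder; this assumes the patch namespace above is on the classpath):

```clojure
(ns my.app ; hypothetical application namespace
  (:require [wkok.openai-clojure.api :as api]
            ;; Load the patch for its side effects, before any API call:
            [azure.openai.patch]))
```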

openapi.yaml update needed

There was an update to the OpenAI openapi.yaml, with the notable change of specifying FunctionParameters.

openai-clojure version

    ChatCompletionFunctions:
         ...
        parameters: {}

OAI latest version

    ChatCompletionFunctions:
        ...
        parameters:
          $ref: "#/components/schemas/FunctionParameters"

ClassCastException when passing in request option async?

Description

When passing request option async? as described in the documentation, I get a ClassCastException: class jdk.internal.net.http.common.MinimalFuture cannot be cast to class clojure.lang.Associative. This is traced back to the function response-for in core.

My guess is that there's some code that does something with the response and assumes it is always a map/Associative when it isn't streaming. I use and like this library enough that I'd be happy to fix it, if you have some initial direction for where the problem is.

Reproduction

I ran the following query:

(api/create-chat-completion {:model "gpt-3.5-turbo"
                             :messages [{:content "PING" :role "user"}]
                             :temperature 0}
                            {:api-key openai-token
                             :request {:async? true}})

And got the following stack trace:

Unhandled clojure.lang.ExceptionInfo
   Interceptor Exception: class
   jdk.internal.net.http.common.MinimalFuture cannot be cast to class
   clojure.lang.Associative
   (jdk.internal.net.http.common.MinimalFuture is in module
   java.net.http of loader 'platform'; clojure.lang.Associative is in
   unnamed module of loader 'app')
   {:execution-id #uuid "2cdd723b-2410-4795-ac27-e28dff70677e",
    :stage :leave,
    :interceptor :martian.hato/keywordize-headers,
    :type java.lang.ClassCastException,
    :exception #error {
    :cause "class jdk.internal.net.http.common.MinimalFuture cannot be cast to class clojure.lang.Associative (jdk.internal.net.http.common.MinimalFuture is in module java.net.http of loader 'platform'; clojure.lang.Associative is in unnamed module of loader 'app')"
    :via
    [{:type java.lang.ClassCastException
      :message "class jdk.internal.net.http.common.MinimalFuture cannot be cast to class clojure.lang.Associative (jdk.internal.net.http.common.MinimalFuture is in module java.net.http of loader 'platform'; clojure.lang.Associative is in unnamed module of loader 'app')"
      :at [clojure.lang.RT assoc "RT.java" 827]}]
    :trace
    [[clojure.lang.RT assoc "RT.java" 827]
     [clojure.core$assoc__5481 invokeStatic "core.clj" 193]
     [clojure.core$update_in$up__6922 invoke "core.clj" 6220]
     [clojure.core$update_in$up__6922 invoke "core.clj" 6219]
     [clojure.core$update_in invokeStatic "core.clj" 6221]
     [clojure.core$update_in doInvoke "core.clj" 6207]
     [clojure.lang.RestFn invoke "RestFn.java" 445]
     [martian.hato$fn__18256 invokeStatic "hato.clj" 42]
     [martian.hato$fn__18256 invoke "hato.clj" 41]
     [tripod.context$try_f invokeStatic "context.cljc" 32]
     [tripod.context$try_f invoke "context.cljc" 22]
     [tripod.context$leave_all_with_binding invokeStatic "context.cljc" 136]
     [tripod.context$leave_all_with_binding invoke "context.cljc" 120]
     [tripod.context$leave_all$fn__13547 invoke "context.cljc" 149]
     [clojure.lang.AFn applyToHelper "AFn.java" 152]
     [clojure.lang.AFn applyTo "AFn.java" 144]
     [clojure.core$apply invokeStatic "core.clj" 667]
     [clojure.core$with_bindings_STAR_ invokeStatic "core.clj" 1990]
     [clojure.core$with_bindings_STAR_ doInvoke "core.clj" 1990]
     [clojure.lang.RestFn invoke "RestFn.java" 425]
     [tripod.context$leave_all invokeStatic "context.cljc" 148]
     [tripod.context$leave_all invoke "context.cljc" 142]
     [tripod.context$execute invokeStatic "context.cljc" 199]
     [tripod.context$execute invoke "context.cljc" 198]
     [martian.core$response_for invokeStatic "core.cljc" 113]
     [martian.core$response_for invoke "core.cljc" 107]
     [wkok.openai_clojure.core$response_for invokeStatic "core.clj" 36]
     [wkok.openai_clojure.core$response_for invoke "core.clj" 16]
     [wkok.openai_clojure.api$create_chat_completion invokeStatic "api.clj" 84]
     [wkok.openai_clojure.api$create_chat_completion invoke "api.clj" 63]
     [server.conversation$eval37315 invokeStatic "form-init10524477006413499415.clj" 286]
     [server.conversation$eval37315 invoke "form-init10524477006413499415.clj" 286]
     [clojure.lang.Compiler eval "Compiler.java" 7194]
     [clojure.lang.Compiler eval "Compiler.java" 7149]
     [clojure.core$eval invokeStatic "core.clj" 3215]
     [clojure.core$eval invoke "core.clj" 3211]
     [nrepl.middleware.interruptible_eval$evaluate$fn__26957$fn__26958 invoke "interruptible_eval.clj" 87]
     [clojure.lang.AFn applyToHelper "AFn.java" 152]
     [clojure.lang.AFn applyTo "AFn.java" 144]
     [clojure.core$apply invokeStatic "core.clj" 667]
     [clojure.core$with_bindings_STAR_ invokeStatic "core.clj" 1990]
     [clojure.core$with_bindings_STAR_ doInvoke "core.clj" 1990]
     [clojure.lang.RestFn invoke "RestFn.java" 425]
     [nrepl.middleware.interruptible_eval$evaluate$fn__26957 invoke "interruptible_eval.clj" 87]
     [clojure.main$repl$read_eval_print__9206$fn__9209 invoke "main.clj" 437]
     [clojure.main$repl$read_eval_print__9206 invoke "main.clj" 437]
     [clojure.main$repl$fn__9215 invoke "main.clj" 458]
     [clojure.main$repl invokeStatic "main.clj" 458]
     [clojure.main$repl doInvoke "main.clj" 368]
     [clojure.lang.RestFn invoke "RestFn.java" 1523]
     [nrepl.middleware.interruptible_eval$evaluate invokeStatic "interruptible_eval.clj" 84]
     [nrepl.middleware.interruptible_eval$evaluate invoke "interruptible_eval.clj" 56]
     [nrepl.middleware.interruptible_eval$interruptible_eval$fn__26990$fn__26994 invoke "interruptible_eval.clj" 152]
     [clojure.lang.AFn run "AFn.java" 22]
     [nrepl.middleware.session$session_exec$main_loop__27060$fn__27064 invoke "session.clj" 218]
     [nrepl.middleware.session$session_exec$main_loop__27060 invoke "session.clj" 217]
     [clojure.lang.AFn run "AFn.java" 22]
     [java.lang.Thread run "Thread.java" 1583]]}}
              context.cljc:   12  tripod.context$exception__GT_ex_info/invokeStatic
              context.cljc:   11  tripod.context$exception__GT_ex_info/invoke
              context.cljc:   35  tripod.context$try_f/invokeStatic
              context.cljc:   22  tripod.context$try_f/invoke
              context.cljc:  136  tripod.context$leave_all_with_binding/invokeStatic
              context.cljc:  120  tripod.context$leave_all_with_binding/invoke
              context.cljc:  149  tripod.context$leave_all$fn__13547/invoke
                  AFn.java:  152  clojure.lang.AFn/applyToHelper
                  AFn.java:  144  clojure.lang.AFn/applyTo
                  core.clj:  667  clojure.core/apply
                  core.clj: 1990  clojure.core/with-bindings*
                  core.clj: 1990  clojure.core/with-bindings*
               RestFn.java:  425  clojure.lang.RestFn/invoke
              context.cljc:  148  tripod.context$leave_all/invokeStatic
              context.cljc:  142  tripod.context$leave_all/invoke
              context.cljc:  199  tripod.context$execute/invokeStatic
              context.cljc:  198  tripod.context$execute/invoke
                 core.cljc:  113  martian.core$response_for/invokeStatic
                 core.cljc:  107  martian.core$response_for/invoke
                  core.clj:   36  wkok.openai-clojure.core/response-for
                  core.clj:   16  wkok.openai-clojure.core/response-for
                   api.clj:   84  wkok.openai-clojure.api/create-chat-completion
                   api.clj:   63  wkok.openai-clojure.api/create-chat-completion
                      REPL:  286  server.conversation/eval37315
                      REPL:  286  server.conversation/eval37315
             Compiler.java: 7194  clojure.lang.Compiler/eval
             Compiler.java: 7149  clojure.lang.Compiler/eval
                  core.clj: 3215  clojure.core/eval
                  core.clj: 3211  clojure.core/eval
    interruptible_eval.clj:   87  nrepl.middleware.interruptible-eval/evaluate/fn/fn
                  AFn.java:  152  clojure.lang.AFn/applyToHelper
                  AFn.java:  144  clojure.lang.AFn/applyTo
                  core.clj:  667  clojure.core/apply
                  core.clj: 1990  clojure.core/with-bindings*
                  core.clj: 1990  clojure.core/with-bindings*
               RestFn.java:  425  clojure.lang.RestFn/invoke
    interruptible_eval.clj:   87  nrepl.middleware.interruptible-eval/evaluate/fn
                  main.clj:  437  clojure.main/repl/read-eval-print/fn
                  main.clj:  437  clojure.main/repl/read-eval-print
                  main.clj:  458  clojure.main/repl/fn
                  main.clj:  458  clojure.main/repl
                  main.clj:  368  clojure.main/repl
               RestFn.java: 1523  clojure.lang.RestFn/invoke
    interruptible_eval.clj:   84  nrepl.middleware.interruptible-eval/evaluate
    interruptible_eval.clj:   56  nrepl.middleware.interruptible-eval/evaluate
    interruptible_eval.clj:  152  nrepl.middleware.interruptible-eval/interruptible-eval/fn/fn
                  AFn.java:   22  clojure.lang.AFn/run
               session.clj:  218  nrepl.middleware.session/session-exec/main-loop/fn
               session.clj:  217  nrepl.middleware.session/session-exec/main-loop
                  AFn.java:   22  clojure.lang.AFn/run
               Thread.java: 1583  java.lang.Thread/run

1. Caused by java.lang.ClassCastException
   class jdk.internal.net.http.common.MinimalFuture cannot be cast to
   class clojure.lang.Associative
   (jdk.internal.net.http.common.MinimalFuture is in module
   java.net.http of loader 'platform'; clojure.lang.Associative is in
   unnamed module of loader 'app')

                   RT.java:  827  clojure.lang.RT/assoc
                  core.clj:  193  clojure.core/assoc
                  core.clj: 6220  clojure.core/update-in/up
                  core.clj: 6219  clojure.core/update-in/up
                  core.clj: 6221  clojure.core/update-in
                  core.clj: 6207  clojure.core/update-in
               RestFn.java:  445  clojure.lang.RestFn/invoke
                  hato.clj:   42  martian.hato/fn
                  hato.clj:   41  martian.hato/fn
              context.cljc:   32  tripod.context$try_f/invokeStatic
              context.cljc:   22  tripod.context$try_f/invoke
              context.cljc:  136  tripod.context$leave_all_with_binding/invokeStatic
              context.cljc:  120  tripod.context$leave_all_with_binding/invoke
              context.cljc:  149  tripod.context$leave_all$fn__13547/invoke
                  AFn.java:  152  clojure.lang.AFn/applyToHelper
                  AFn.java:  144  clojure.lang.AFn/applyTo
                  core.clj:  667  clojure.core/apply
                  core.clj: 1990  clojure.core/with-bindings*
                  core.clj: 1990  clojure.core/with-bindings*
               RestFn.java:  425  clojure.lang.RestFn/invoke
              context.cljc:  148  tripod.context$leave_all/invokeStatic
              context.cljc:  142  tripod.context$leave_all/invoke
              context.cljc:  199  tripod.context$execute/invokeStatic
              context.cljc:  198  tripod.context$execute/invoke
                 core.cljc:  113  martian.core$response_for/invokeStatic
                 core.cljc:  107  martian.core$response_for/invoke
                  core.clj:   36  wkok.openai-clojure.core/response-for
                  core.clj:   16  wkok.openai-clojure.core/response-for
                   api.clj:   84  wkok.openai-clojure.api/create-chat-completion
                   api.clj:   63  wkok.openai-clojure.api/create-chat-completion
                      REPL:  286  server.conversation/eval37315
                      REPL:  286  server.conversation/eval37315
             Compiler.java: 7194  clojure.lang.Compiler/eval
             Compiler.java: 7149  clojure.lang.Compiler/eval
                  core.clj: 3215  clojure.core/eval
                  core.clj: 3211  clojure.core/eval
    interruptible_eval.clj:   87  nrepl.middleware.interruptible-eval/evaluate/fn/fn
                  AFn.java:  152  clojure.lang.AFn/applyToHelper
                  AFn.java:  144  clojure.lang.AFn/applyTo
                  core.clj:  667  clojure.core/apply
                  core.clj: 1990  clojure.core/with-bindings*
                  core.clj: 1990  clojure.core/with-bindings*
               RestFn.java:  425  clojure.lang.RestFn/invoke
    interruptible_eval.clj:   87  nrepl.middleware.interruptible-eval/evaluate/fn
                  main.clj:  437  clojure.main/repl/read-eval-print/fn
                  main.clj:  437  clojure.main/repl/read-eval-print
                  main.clj:  458  clojure.main/repl/fn
                  main.clj:  458  clojure.main/repl
                  main.clj:  368  clojure.main/repl
               RestFn.java: 1523  clojure.lang.RestFn/invoke
    interruptible_eval.clj:   84  nrepl.middleware.interruptible-eval/evaluate
    interruptible_eval.clj:   56  nrepl.middleware.interruptible-eval/evaluate
    interruptible_eval.clj:  152  nrepl.middleware.interruptible-eval/interruptible-eval/fn/fn
                  AFn.java:   22  clojure.lang.AFn/run
               session.clj:  218  nrepl.middleware.session/session-exec/main-loop/fn
               session.clj:  217  nrepl.middleware.session/session-exec/main-loop
                  AFn.java:   22  clojure.lang.AFn/run
               Thread.java: 1583  java.lang.Thread/run

SSE?

OpenAI offers SSE to stream results as they are generated. It would be great if this library supported that.

jvm8 support

Execution error (ClassNotFoundException) at java.net.URLClassLoader/findClass (URLClassLoader.java:387).
java.net.http.HttpClient$Redirect

Seems like it's not available in Java 8 🤔

Warning about no matching content-type

Hi,

When I call any OpenAI function for the first time I see the following warning printed:

No matching content-type available {:supported-content-types #{application/json application/transit+msgpack application/transit+json application/edn}, :available-content-types (application/octet-stream), :header Content-Type}

But everything seems to work normally and the warning isn't printed a second time.

Issue with "tools" parameter

I think this may be related to #37, or could be dealt with in a similar way.

The parameters for tools should be "Any" to work around the same issue, I think?

Thanks for your work on this library.

To reproduce provide a function using the tools parameter. Here's the same example modified from the issue to use tools:

(create-chat-completion {:model       "gpt-3.5-turbo"
                         :messages    [{:role    "user"
                                        :content "Wikipedia page about foxes"}]
                         :tools
                         [{:type     "function"
                           :function {:name        "get_current_weather"
                                      :description "Get the current weather in a given location"
                                      :parameters
                                      {:type       "object"
                                       :properties {:location {:type        "string"
                                                               :description "The city and state, e.g. San Francisco, CA"}
                                                    :unit     {:type "string"
                                                               :enum ["celsius" "fahrenheit"]}}}}}]
                         :tool_choice "auto"})

Which throws an exception:
"Execution error (ExceptionInfo) at schema-tools.coerce/coerce-or-error! (coerce.cljc:24).
Could not coerce value to schema: {:body {:tools [{:function {:parameters {:type disallowed-key, :properties disallowed-key}}}]}}"

{:type :schema-tools.coerce/error,
    :schema
    {:body
     {{:k :function_call} Any,
      {:k :stop} Any,
      {:k :n} (maybe (default Int 1)),
      {:k :presence_penalty} (maybe (default Num 0)),
      {:k :seed} (maybe Int),
      {:k :stream} (maybe (default Bool false)),
      {:k :temperature} (maybe (default Num 1)),
      {:k :functions}
      [{{:k :description} java.lang.String,
        :name java.lang.String,
        :parameters Any}],
      {:k :max_tokens} (maybe (default Int 16384)),
      {:k :response_format}
      {{:k :type} (default (enum "json_object" "text") "text")},
      :messages [Any],
      {:k :tools}
      [{:type (enum "function"),
        :function
        {{:k :description} java.lang.String,
         :name java.lang.String,
         :parameters {}}}],
      {:k :tool_choice} Any,
      {:k :frequency_penalty} (maybe (default Num 0)),
      {:k :logit_bias} (maybe {}),
      {:k :top_p} (maybe (default Num 1)),
      {:k :user} java.lang.String,
      :model Any}},
    :value
    {:body
     {:model "gpt-3.5-turbo",
      :messages [{:role "user", :content "Wikipedia page about foxes"}],
      :tools
      [{:type "function",
        :function
        {:name "get_current_weather",
         :description "Get the current weather in a given location",
         :parameters
         {:type "object",
          :properties {:location {#, #}, :unit {#, #}}}}}],
      :tool_choice "auto",
      :wkok.openai-clojure.core/options nil}},
    :error
    {:body
     {:tools
      [{:function
        {:parameters
         {:type disallowed-key, :properties disallowed-key}}}]}}}

Is it possible to use the "functions" feature?

I tried the following, but it didn't seem to work; it's adapted from https://platform.openai.com/docs/guides/gpt/function-calling

(api/create-chat-completion
  {:model "gpt-3.5-turbo-0613"
   :messages [{:role "user" :content "Whats the weather like in Boston?"}]
   :functions [{:name "get_current_weather"
                :description "Get the current weather in a given location"
                :parameters {:type "object"
                             :properties {:location {:type "string"
                                                     :description "The city and state, e.g. San Francisco, CA"}
                                          :unit {:type "string"
                                                 :enum ["celsius" "fahrenheit"]}}
                             :required ["location"]}}]
   :function-call "auto"}
  {:api-key KEY})

Support for Speech API

Thank you for this great library & being active in maintaining it.

I see currently it does not support the speech API, and my biggest need: Streaming Speech API

Is this planned or development in progress? If not, I would love to contribute with a PR

Let me know how I should proceed, thank you!

A way to get the raw response?

Hi, nice library, thank you!

I'd like to store the raw responses in my database, exactly as they were returned from the OpenAI API.

Is there a way to access them?

If not, please consider this a feature request.

Thank you!

Warnings printed to console on first invocation

Thank you for this library! I'm noticing that there's a flurry of warning messages that are printed to the console when we invoke create-completion for the first time. Is this expected? User error?

(require '[wkok.openai-clojure.api :as openai])
(openai/create-completion
 {:model "text-davinci-003"
  :prompt "Here are some Pokemon: Bulbasaur, "
  :max_tokens 200}
 {:api-key api-key})
No matching content-type available {:supported-content-types #{application/json application/transit+msgpack application/transit+json application/edn}, :available-content-types (multipart/form-data), :header Accept}
No matching content-type available {:supported-content-types #{application/json application/transit+msgpack application/transit+json application/edn}, :available-content-types (multipart/form-data), :header Accept}
No matching content-type available {:supported-content-types #{application/json application/transit+msgpack application/transit+json application/edn}, :available-content-types (multipart/form-data), :header Accept}
No matching content-type available {:supported-content-types #{application/json application/transit+msgpack application/transit+json application/edn}, :available-content-types (multipart/form-data), :header Accept}
No matching content-type available {:supported-content-types #{application/json application/transit+msgpack application/transit+json application/edn}, :available-content-types (multipart/form-data), :header Accept}

[Support] Allow nil to be a value in parameters

I'm unsure whether this report belongs in this library, but I want to mention that the error below occurs when the content is nil during function calls.

; Execution error (ExceptionInfo) at schema-tools.coerce/coerce-or-error! (coerce.cljc:24).
; Could not coerce the value to schema: {:body {:messages [nil nil nil nil nil nil {:content (not (instance? java.lang.String nil))} nil]}}

This issue can be reproduced when using function calls as a response of create-chat-completion. The response value appears as {:role "assistant", :content nil, :function_call {:name "some-function", :arguments "{\n \"name\": \"someName\"\n}"}}, but it fails the library's schema validation when sent back. I may have a workaround, but I'd like to ask whether supporting nil as a value is allowed.
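One possible client-side workaround (a sketch, not part of the library; the function name is illustrative) is to replace a nil :content with an empty string before sending the messages back:

```clojure
(defn fix-nil-content
  "Replace a nil :content with \"\" so assistant function-call
   messages pass the schema check (client-side workaround sketch)."
  [messages]
  (mapv #(update % :content (fnil identity "")) messages))

(fix-nil-content
 [{:role "assistant" :content nil
   :function_call {:name "some-function" :arguments "{}"}}])
;; => [{:role "assistant" :content ""
;;      :function_call {:name "some-function" :arguments "{}"}}]
```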

Reference: https://platform.openai.com/docs/api-reference/chat/create

content string Optional
The contents of the message. content is required for all messages except assistant messages with function calls.

unable to find valid certification path to requested target

Error printing return value (SunCertPathBuilderException) at sun.security.provider.certpath.SunCertPathBuilder/build (SunCertPathBuilder.java:141).
unable to find valid certification path to requested target

Using JVM 11 and version 0.9.0, I can't send any requests anymore.

on-next not called when stream option is true

I am calling create-chat-completion with the options :stream true :on-next #(log/info :on-next %) and nothing is logged; instead, all the "chat.completion.chunk" events are dumped at once after some time.

I am using Azure.
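For reference, a call using the streaming options described above might look like this (a sketch based on the issue's description; the model name and handler are placeholders, and it assumes the options go in the request map):

```clojure
(api/create-chat-completion
 {:model "gpt-3.5-turbo"
  :messages [{:role "user" :content "Hello"}]
  :stream true
  ;; Expected to be invoked once per "chat.completion.chunk" event:
  :on-next (fn [chunk] (println chunk))})
```

This performs a network call, so it cannot be run without valid credentials.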

Spec validation errors

We use spec instrumentation to check calls (I use orchestra but it happens with clojure.spec.test.alpha/instrument also), and experience failures when getting embeddings. Calls fail in martian.core/build-instance, an upgrade to martian 0.1.23 doesn't solve the issue - it could be an upstream problem.

The workaround is to disable instrumentation, but that means for the entire codebase, and it's really useful so that would be a shame.

Sorry for the long stacktrace, it looks like the args include a bunch of API info.

The spec error looks like an issue with a :consumes parameter, which is supposed to be nil? but appears to be a vector containing nil, as in [nil]

    wkok.openai-clojure.api/create-embedding                  api.clj:  183
    wkok.openai-clojure.api/create-embedding                  api.clj:  188
        wkok.openai-clojure.core/response-for                 core.clj:   26
                           clojure.core/deref                 core.clj: 2337
...
                  wkok.openai-clojure.openai/fn               openai.clj:   81
   wkok.openai-clojure.openai/bootstrap-openapi               openai.clj:   78
...
                martian.core/bootstrap-openapi                core.cljc:  128
                martian.core/bootstrap-openapi                core.cljc:  132
...
       orchestra.spec.test/spec-checking-fn/fn                test.cljc:   36
 orchestra.spec.test/spec-checking-fn/conform!                test.cljc:   30
clojure.lang.ExceptionInfo: Call to martian.core/build-instance did not conform to spec.
clojure.spec.alpha/args: ("https://api.openai.com/v1"
                           ({:description nil,
                             :method :get,
                             :produces ["application/json"],
                             :path-schema {:model java.lang.String},
                             :query-schema nil,
                             :parameter-aliases
                             {:path-schema {},
                              :query-schema {},
                              :body-schema {},
                              :form-schema {},
                              :headers-schema {}},
                             :form-schema nil,
                             :path-parts ["/models/" :model],
                             :headers-schema nil,
                             :openapi-definition
                             {:operationId "retrieveModel",
                              :tags ["OpenAI"],
                              :summary
                              "Retrieves a model instance, providing basic information about the model such as the owner and permissioning.",
                              :parameters
                              [{:in "path",
                                :name "model",
                                :required true,
                                :schema {:type "string", :example "text-davinci-001"},
                                :description "The ID of the model to use for this request"}],
                              :responses
                              {:200
                               {:description "OK",
                                :content
                                #:application{:json
                                              {:schema {:$ref "#/components/schemas/Model"}}}}},
                              :x-oaiMeta
                              {:name "Retrieve model",
                               :group "models",
                               :path "retrieve",
                               :examples
                               {:curl
                                "curl https://api.openai.com/v1/models/VAR_model_id \\\n  -H 'Authorization: Bearer YOUR_API_KEY'\n",
                                :python
                                "import os\nimport openai\nopenai.api_key = os.getenv(\"OPENAI_API_KEY\")\nopenai.Model.retrieve(\"VAR_model_id\")\n",
                                :node.js
                                "const { Configuration, OpenAIApi } = require(\"openai\");\nconst configuration = new Configuration({\n  apiKey: process.env.OPENAI_API_KEY,\n});\nconst openai = new OpenAIApi(configuration);\nconst response = await openai.retrieveModel(\"VAR_model_id\");\n"},
                               :response
                               "{\n  \"id\": \"VAR_model_id\",\n  \"object\": \"model\",\n  \"owned_by\": \"openai\",\n  \"permission\": [...]\n}\n"}},
                             ...}
                            {:description nil,
                             :method :delete,
                             :produces ["application/json"],
                             :path-schema {:model java.lang.String},
                             :query-schema nil,
                             :parameter-aliases
                             {:path-schema {},
                              :query-schema {},
                              :body-schema {},
                              :form-schema {},
                              :headers-schema {}},
                             :form-schema nil,
                             :path-parts ["/models/" :model],
                             :headers-schema nil,
                             :openapi-definition
                             {:operationId "deleteModel",
                              :tags ["OpenAI"],
                              :summary
                              "Delete a fine-tuned model. You must have the Owner role in your organization.",
                              :parameters
                              [{:in "path",
                                :name "model",
                                :required true,
                                :schema
                                {:type "string", :example "curie:ft-acmeco-2021-03-03-21-44-20"},
                                :description "The model to delete"}],
                              :responses
                              {:200
                               {:description "OK",
                                :content
                                #:application{:json
                                              {:schema
                                               {:$ref
                                                "#/components/schemas/DeleteModelResponse"}}}}},
                              :x-oaiMeta
                              {:name "Delete fine-tune model",
                               :group "fine-tunes",
                               :path "delete-model",
                               :examples
                               {:curl
                                "curl https://api.openai.com/v1/models/curie:ft-acmeco-2021-03-03-21-44-20 \\\n  -X DELETE \\\n  -H \"Authorization: Bearer YOUR_API_KEY\"\n",
                                :python
                                "import os\nimport openai\nopenai.api_key = os.getenv(\"OPENAI_API_KEY\")\nopenai.Model.delete(\"curie:ft-acmeco-2021-03-03-21-44-20\")\n",
                                :node.js
                                "const { Configuration, OpenAIApi } = require(\"openai\");\nconst configuration = new Configuration({\n  apiKey: process.env.OPENAI_API_KEY,\n});\nconst openai = new OpenAIApi(configuration);\nconst response = await openai.deleteModel('curie:ft-acmeco-2021-03-03-21-44-20');\n"},
                               :response
                               "{\n  \"id\": \"curie:ft-acmeco-2021-03-03-21-44-20\",\n  \"object\": \"model\",\n  \"deleted\": true\n}\n"}},
                             ...}
                            {:description nil,
                             :method :get,
                             :produces ["application/json"],
                             :path-schema nil,
                             :query-schema nil,
                             :parameter-aliases
                             {:path-schema {},
                              :query-schema {},
                              :body-schema {},
                              :form-schema {},
                              :headers-schema {}},
                             :form-schema nil,
                             :path-parts ["/engines"],
                             :headers-schema nil,
                             :openapi-definition
                             {:operationId "listEngines",
                              :deprecated true,
                              :tags ["OpenAI"],
                              :summary
                              "Lists the currently available (non-finetuned) models, and provides basic information about each one such as the owner and availability.",
                              :responses
                              {:200
                               {:description "OK",
                                :content
                                #:application{:json
                                              {:schema
                                               {:$ref
                                                "#/components/schemas/ListEnginesResponse"}}}}},
                              :x-oaiMeta
                              {:name "List engines",
                               :group "engines",
                               :path "list",
                               :examples
                               {:curl
                                "curl https://api.openai.com/v1/engines \\\n  -H 'Authorization: Bearer YOUR_API_KEY'\n",
                                :python
                                "import os\nimport openai\nopenai.api_key = os.getenv(\"OPENAI_API_KEY\")\nopenai.Engine.list()\n",
                                :node.js
                                "const { Configuration, OpenAIApi } = require(\"openai\");\nconst configuration = new Configuration({\n  apiKey: process.env.OPENAI_API_KEY,\n});\nconst openai = new OpenAIApi(configuration);\nconst response = await openai.listEngines();\n"},
                               :response
                               "{\n  \"data\": [\n    {\n      \"id\": \"engine-id-0\",\n      \"object\": \"engine\",\n      \"owner\": \"organization-owner\",\n      \"ready\": true\n    },\n    {\n      \"id\": \"engine-id-2\",\n      \"object\": \"engine\",\n      \"owner\": \"organization-owner\",\n      \"ready\": true\n    },\n    {\n      \"id\": \"engine-id-3\",\n      \"object\": \"engine\",\n      \"owner\": \"openai\",\n      \"ready\": false\n    },\n  ],\n  \"object\": \"list\"\n}\n"}},
                             ...}
                            {:description nil,
                             :method :post,
                             :produces ["application/json"],
                             :path-schema nil,
                             :query-schema nil,
                             :parameter-aliases
                             {:path-schema {},
                              :query-schema {},
                              :body-schema
                              {[:body]
                               {:return-metadata :return_metadata,
                                :search-model :search_model,
                                :return-prompt :return_prompt,
                                :logit-bias :logit_bias,
                                :max-examples :max_examples}},
                              :form-schema {},
                              :headers-schema {}},
                             :form-schema nil,
                             :path-parts ["/classifications"],
                             :headers-schema nil,
                             :openapi-definition
                             {:operationId "createClassification",
                              :deprecated true,
                              :tags ["OpenAI"],
                              :summary
                              "Classifies the specified `query` using provided examples.\n\nThe endpoint first [searches](/docs/api-reference/searches) over the labeled examples\nto select the ones most relevant for the particular query. Then, the relevant examples\nare combined with the query to construct a prompt to produce the final label via the\n[completions](/docs/api-reference/completions) endpoint.\n\nLabeled examples can be provided via an uploaded `file`, or explicitly listed in the\nrequest using the `examples` parameter for quick tests and small scale use cases.\n",
                              :requestBody
                              {:required true,
                               :content
                               #:application{:json
                                             {:schema
                                              {:$ref
                                               "#/components/schemas/CreateClassificationRequest"}}}},
                              :responses
                              {:200
                               {:description "OK",
                                :content
                                #:application{:json
                                              {:schema
                                               {:$ref
                                                "#/components/schemas/CreateClassificationResponse"}}}}},
                              :x-oaiMeta
                              {:name "Create classification",
                               :group "classifications",
                               :path "create",
                               :examples
                               {:curl
                                "curl https://api.openai.com/v1/classifications \\\n  -X POST \\\n  -H \"Authorization: Bearer YOUR_API_KEY\" \\\n  -H 'Content-Type: application/json' \\\n  -d '{\n    \"examples\": [\n      [\"A happy moment\", \"Positive\"],\n      [\"I am sad.\", \"Negative\"],\n      [\"I am feeling awesome\", \"Positive\"]],\n    \"query\": \"It is a raining day :(\",\n    \"search_model\": \"ada\",\n    \"model\": \"curie\",\n    \"labels\":[\"Positive\", \"Negative\", \"Neutral\"]\n  }'\n",
                                :python
                                "import os\nimport openai\nopenai.api_key = os.getenv(\"OPENAI_API_KEY\")\nopenai.Classification.create(\n  search_model=\"ada\",\n  model=\"curie\",\n  examples=[\n    [\"A happy moment\", \"Positive\"],\n    [\"I am sad.\", \"Negative\"],\n    [\"I am feeling awesome\", \"Positive\"]\n  ],\n  query=\"It is a raining day :(\",\n  labels=[\"Positive\", \"Negative\", \"Neutral\"],\n)\n",
                                :node.js
                                "const { Configuration, OpenAIApi } = require(\"openai\");\nconst configuration = new Configuration({\n  apiKey: process.env.OPENAI_API_KEY,\n});\nconst openai = new OpenAIApi(configuration);\nconst response = await openai.createClassification({\n  search_model: \"ada\",\n  model: \"curie\",\n  examples: [\n    [\"A happy moment\", \"Positive\"],\n    [\"I am sad.\", \"Negative\"],\n    [\"I am feeling awesome\", \"Positive\"]\n  ],\n  query:\"It is a raining day :(\",\n  labels: [\"Positive\", \"Negative\", \"Neutral\"],\n});\n"},
                               :parameters
                               "{\n  \"examples\": [\n    [\"A happy moment\", \"Positive\"],\n    [\"I am sad.\", \"Negative\"],\n    [\"I am feeling awesome\", \"Positive\"]\n  ],\n  \"labels\": [\"Positive\", \"Negative\", \"Neutral\"],\n  \"query\": \"It is a raining day :(\",\n  \"search_model\": \"ada\",\n  \"model\": \"curie\"\n}\n",
                               :response
                               "{\n  \"completion\": \"cmpl-2euN7lUVZ0d4RKbQqRV79IiiE6M1f\",\n  \"label\": \"Negative\",\n  \"model\": \"curie:2020-05-03\",\n  \"object\": \"classification\",\n  \"search_model\": \"ada\",\n  \"selected_examples\": [\n    {\n      \"document\": 1,\n      \"label\": \"Negative\",\n      \"text\": \"I am sad.\"\n    },\n    {\n      \"document\": 0,\n      \"label\": \"Positive\",\n      \"text\": \"A happy moment\"\n    },\n    {\n      \"document\": 2,\n      \"label\": \"Positive\",\n      \"text\": \"I am feeling awesome\"\n    }\n  ]\n}\n"}},
                             ...}
                            {:description nil,
                             :method :post,
                             :produces ["application/json"],
                             :path-schema {:engine_id java.lang.String},
                             :query-schema nil,
                             :parameter-aliases
                             {:path-schema {[] {:engine-id :engine_id}},
                              :query-schema {},
                              :body-schema
                              {[:body]
                               {:max-rerank :max_rerank, :return-metadata :return_metadata}},
                              :form-schema {},
                              :headers-schema {}},
                             :form-schema nil,
                             :path-parts ["/engines/" :engine_id "/search"],
                             :headers-schema nil,
                             :openapi-definition
                             {:operationId "createSearch",
                              :deprecated true,
                              :tags ["OpenAI"],
                              :summary
                              "The search endpoint computes similarity scores between provided query and documents. Documents can be passed directly to the API if there are no more than 200 of them.\n\nTo go beyond the 200 document limit, documents can be processed offline and then used for efficient retrieval at query time. When `file` is set, the search endpoint searches over all the documents in the given file and returns up to the `max_rerank` number of documents. These documents will be returned along with their search scores.\n\nThe similarity score is a positive score that usually ranges from 0 to 300 (but can sometimes go higher), where a score above 200 usually means the document is semantically similar to the query.\n",
                              :parameters
                              [{:in "path",
                                :name "engine_id",
                                :required true,
                                :schema {:type "string", :example "davinci"},
                                :description
                                "The ID of the engine to use for this request.  You can select one of `ada`, `babbage`, `curie`, or `davinci`."}],
                              :requestBody
                              {:required true,
                               :content
                               #:application{:json
                                             {:schema
                                              {:$ref
                                               "#/components/schemas/CreateSearchRequest"}}}},
                              :responses
                              {:200
                               {:description "OK",
                                :content
                                #:application{:json
                                              {:schema
                                               {:$ref
                                                "#/components/schemas/CreateSearchResponse"}}}}},
                              :x-oaiMeta
                              {:name "Create search",
                               :group "searches",
                               :path "create",
                               :examples
                               {:curl
                                "curl https://api.openai.com/v1/engines/davinci/search \\\n  -H \"Content-Type: application/json\" \\\n  -H 'Authorization: Bearer YOUR_API_KEY' \\\n  -d '{\n  \"documents\": [\"White House\", \"hospital\", \"school\"],\n  \"query\": \"the president\"\n}'\n",
                                :python
                                "import os\nimport openai\nopenai.api_key = os.getenv(\"OPENAI_API_KEY\")\nopenai.Engine(\"davinci\").search(\n  documents=[\"White House\", \"hospital\", \"school\"],\n  query=\"the president\"\n)\n",
                                :node.js
                                "const { Configuration, OpenAIApi } = require(\"openai\");\nconst configuration = new Configuration({\n  apiKey: process.env.OPENAI_API_KEY,\n});\nconst openai = new OpenAIApi(configuration);\nconst response = await openai.createSearch(\"davinci\", {\n  documents: [\"White House\", \"hospital\", \"school\"],\n  query: \"the president\",\n});\n"},
                               :parameters
                               "{\n  \"documents\": [\n    \"White House\",\n    \"hospital\",\n    \"school\"\n  ],\n  \"query\": \"the president\"\n}\n",
                               :response
                               "{\n  \"data\": [\n    {\n      \"document\": 0,\n      \"object\": \"search_result\",\n      \"score\": 215.412\n    },\n    {\n      \"document\": 1,\n      \"object\": \"search_result\",\n      \"score\": 40.316\n    },\n    {\n      \"document\": 2,\n      \"object\": \"search_result\",\n      \"score\":  55.226\n    }\n  ],\n  \"object\": \"list\"\n}\n"}},
                             ...}
                            {:description nil,
                             :method :post,
                             :produces ["application/json"],
                             :path-schema nil,
                             :query-schema nil,
                             :parameter-aliases
                             {:path-schema {},
                              :query-schema {},
                              :body-schema {[:body] {:response-format :response_format}},
                              :form-schema {},
                              :headers-schema {}},
                             :form-schema nil,
                             :path-parts ["/images/variations"],
                             :headers-schema nil,
                             :openapi-definition
                             {:operationId "createImageVariation",
                              :tags ["OpenAI"],
                              :summary "Creates a variation of a given image.",
                              :requestBody
                              {:required true,
                               :content
                               #:multipart{:form-data
                                           {:schema
                                            {:$ref
                                             "#/components/schemas/CreateImageVariationRequest"}}}},
                              :responses
                              {:200
                               {:description "OK",
                                :content
                                #:application{:json
                                              {:schema
                                               {:$ref "#/components/schemas/ImagesResponse"}}}}},
                              :x-oaiMeta
                              {:name "Create image variation",
                               :group "images",
                               :path "create-variation",
                               :beta true,
                               :examples
                               {:curl
                                "curl https://api.openai.com/v1/images/variations \\\n  -H 'Authorization: Bearer YOUR_API_KEY' \\\n  -F image='@otter.png' \\\n  -F n=2 \\\n  -F size=\"1024x1024\"\n",
                                :python
                                "import os\nimport openai\nopenai.api_key = os.getenv(\"OPENAI_API_KEY\")\nopenai.Image.create_variation(\n  image=open(\"otter.png\", \"rb\"),\n  n=2,\n  size=\"1024x1024\"\n)\n",
                                :node.js
                                "const { Configuration, OpenAIApi } = require(\"openai\");\nconst configuration = new Configuration({\n  apiKey: process.env.OPENAI_API_KEY,\n});\nconst openai = new OpenAIApi(configuration);\nconst response = await openai.createImageVariation(\n  fs.createReadStream(\"otter.png\"),\n  2,\n  \"1024x1024\"\n);\n"},
                               :response
                               "{\n  \"created\": 1589478378,\n  \"data\": [\n    {\n      \"url\": \"https://...\"\n    },\n    {\n      \"url\": \"https://...\"\n    }\n  ]\n}\n"}},
                             ...}
                            {:description nil,
                             :method :get,
                             :produces ["application/json"],
                             :path-schema {:fine_tune_id java.lang.String},
                             :query-schema nil,
                             :parameter-aliases
                             {:path-schema {[] {:fine-tune-id :fine_tune_id}},
                              :query-schema {},
                              :body-schema {},
                              :form-schema {},
                              :headers-schema {}},
                             :form-schema nil,
                             :path-parts ["/fine-tunes/" :fine_tune_id],
                             :headers-schema nil,
                             :openapi-definition
                             {:operationId "retrieveFineTune",
                              :tags ["OpenAI"],
                              :summary
                              "Gets info about the fine-tune job.\n\n[Learn more about Fine-tuning](/docs/guides/fine-tuning)\n",
                              :parameters
                              [{:in "path",
                                :name "fine_tune_id",
                                :required true,
                                :schema {:type "string", :example "ft-AF1WoRqd3aJAHsqc9NY7iL8F"},
                                :description "The ID of the fine-tune job\n"}],
                              :responses
                              {:200
                               {:description "OK",
                                :content
                                #:application{:json
                                              {:schema
                                               {:$ref "#/components/schemas/FineTune"}}}}},
                              :x-oaiMeta
                              {:name "Retrieve fine-tune",
                               :group "fine-tunes",
                               :path "retrieve",
                               :examples
                               {:curl
                                "curl https://api.openai.com/v1/fine-tunes/ft-AF1WoRqd3aJAHsqc9NY7iL8F \\\n  -H \"Authorization: Bearer YOUR_API_KEY\"\n",
                                :python
                                "import os\nimport openai\nopenai.api_key = os.getenv(\"OPENAI_API_KEY\")\nopenai.FineTune.retrieve(id=\"ft-AF1WoRqd3aJAHsqc9NY7iL8F\")\n",
                                :node.js
                                "const { Configuration, OpenAIApi } = require(\"openai\");\nconst configuration = new Configuration({\n  apiKey: process.env.OPENAI_API_KEY,\n});\nconst openai = new OpenAIApi(configuration);\nconst response = await openai.retrieveFineTune(\"ft-AF1WoRqd3aJAHsqc9NY7iL8F\");\n"},
                               :response
                               "{\n  \"id\": \"ft-AF1WoRqd3aJAHsqc9NY7iL8F\",\n  \"object\": \"fine-tune\",\n  \"model\": \"curie\",\n  \"created_at\": 1614807352,\n  \"events\": [\n    {\n      \"object\": \"fine-tune-event\",\n      \"created_at\": 1614807352,\n      \"level\": \"info\",\n      \"message\": \"Job enqueued. Waiting for jobs ahead to complete. Queue number: 0.\"\n    },\n    {\n      \"object\": \"fine-tune-event\",\n      \"created_at\": 1614807356,\n      \"level\": \"info\",\n      \"message\": \"Job started.\"\n    },\n    {\n      \"object\": \"fine-tune-event\",\n      \"created_at\": 1614807861,\n      \"level\": \"info\",\n      \"message\": \"Uploaded snapshot: curie:ft-acmeco-2021-03-03-21-44-20.\"\n    },\n    {\n      \"object\": \"fine-tune-event\",\n      \"created_at\": 1614807864,\n      \"level\": \"info\",\n      \"message\": \"Uploaded result files: file-QQm6ZpqdNwAaVC3aSz5sWwLT.\"\n    },\n    {\n      \"object\": \"fine-tune-event\",\n      \"created_at\": 1614807864,\n      \"level\": \"info\",\n      \"message\": \"Job succeeded.\"\n    }\n  ],\n  \"fine_tuned_model\": \"curie:ft-acmeco-2021-03-03-21-44-20\",\n  \"hyperparams\": {\n    \"batch_size\": 4,\n    \"learning_rate_multiplier\": 0.1,\n    \"n_epochs\": 4,\n    \"prompt_loss_weight\": 0.1,\n  },\n  \"organization_id\": \"org-...\",\n  \"result_files\": [\n    {\n      \"id\": \"file-QQm6ZpqdNwAaVC3aSz5sWwLT\",\n      \"object\": \"file\",\n      \"bytes\": 81509,\n      \"created_at\": 1614807863,\n      \"filename\": \"compiled_results.csv\",\n      \"purpose\": \"fine-tune-results\"\n    }\n  ],\n  \"status\": \"succeeded\",\n  \"validation_files\": [],\n  \"training_files\": [\n    {\n      \"id\": \"file-XGinujblHPwGLSztz8cPS8XY\",\n      \"object\": \"file\",\n      \"bytes\": 1547276,\n      \"created_at\": 1610062281,\n      \"filename\": \"my-data-train.jsonl\",\n      \"purpose\": \"fine-tune-train\"\n    }\n  ],\n  \"updated_at\": 1614807865,\n}\n"}},
                             ...}
                            {:description nil,
                             :method :post,
                             :produces ["application/json"],
                             :path-schema nil,
                             :query-schema nil,
                             :parameter-aliases
                             {:path-schema {},
                              :query-schema {},
                              :body-schema {[:body] {:response-format :response_format}},
                              :form-schema {},
                              :headers-schema {}},
                             :form-schema nil,
                             :path-parts ["/images/generations"],
                             :headers-schema nil,
                             :openapi-definition
                             {:operationId "createImage",
                              :tags ["OpenAI"],
                              :summary "Creates an image given a prompt.",
                              :requestBody
                              {:required true,
                               :content
                               #:application{:json
                                             {:schema
                                              {:$ref
                                               "#/components/schemas/CreateImageRequest"}}}},
                              :responses
                              {:200
                               {:description "OK",
                                :content
                                #:application{:json
                                              {:schema
                                               {:$ref "#/components/schemas/ImagesResponse"}}}}},
                              :x-oaiMeta
                              {:name "Create image",
                               :group "images",
                               :path "create",
                               :beta true,
                               :examples
                               {:curl
                                "curl https://api.openai.com/v1/images/generations \\\n  -H 'Content-Type: application/json' \\\n  -H 'Authorization: Bearer YOUR_API_KEY' \\\n  -d '{\n  \"prompt\": \"A cute baby sea otter\",\n  \"n\": 2,\n  \"size\": \"1024x1024\"\n}'\n",
                                :python
                                "import os\nimport openai\nopenai.api_key = os.getenv(\"OPENAI_API_KEY\")\nopenai.Image.create(\n  prompt=\"A cute baby sea otter\",\n  n=2,\n  size=\"1024x1024\"\n)\n",
                                :node.js
                                "const { Configuration, OpenAIApi } = require(\"openai\");\nconst configuration = new Configuration({\n  apiKey: process.env.OPENAI_API_KEY,\n});\nconst openai = new OpenAIApi(configuration);\nconst response = await openai.createImage({\n  prompt: \"A cute baby sea otter\",\n  n: 2,\n  size: \"1024x1024\",\n});\n"},
                               :parameters
                               "{\n  \"prompt\": \"A cute baby sea otter\",\n  \"n\": 2,\n  \"size\": \"1024x1024\"\n}\n",
                               :response
                               "{\n  \"created\": 1589478378,\n  \"data\": [\n    {\n      \"url\": \"https://...\"\n    },\n    {\n      \"url\": \"https://...\"\n    }\n  ]\n}\n"}},
                             ...}
                            {:description nil,
                             :method :post,
                             :produces ["application/json"],
                             :path-schema nil,
                             :query-schema nil,
                             :parameter-aliases
                             {:path-schema {},
                              :query-schema {},
                              :body-schema
                              {[:body]
                               {:presence-penalty :presence_penalty,
                                :max-tokens :max_tokens,
                                :frequency-penalty :frequency_penalty,
                                :logit-bias :logit_bias,
                                :top-p :top_p}},
                              :form-schema {},
                              :headers-schema {}},
                             :form-schema nil,
                             :path-parts ["/chat/completions"],
                             :headers-schema nil,
                             :openapi-definition
                             {:operationId "createChatCompletion",
                              :tags ["OpenAI"],
                              :summary "Creates a completion for the chat message",
                              :requestBody
                              {:required true,
                               :content
                               #:application{:json
                                             {:schema
                                              {:$ref
                                               "#/components/schemas/CreateChatCompletionRequest"}}}},
                              :responses
                              {:200
                               {:description "OK",
                                :content
                                #:application{:json
                                              {:schema
                                               {:$ref
                                                "#/components/schemas/CreateChatCompletionResponse"}}}}},
                              :x-oaiMeta
                              {:name "Create chat completion",
                               :group "chat",
                               :path "create",
                               :beta true,
                               :examples
                               {:curl
                                "curl https://api.openai.com/v1/chat/completions \\\n  -H 'Content-Type: application/json' \\\n  -H 'Authorization: Bearer YOUR_API_KEY' \\\n  -d '{\n  \"model\": \"gpt-3.5-turbo\",\n  \"messages\": [{\"role\": \"user\", \"content\": \"Hello!\"}]\n}'\n",
                                :python
                                "import os\nimport openai\nopenai.api_key = os.getenv(\"OPENAI_API_KEY\")\n\ncompletion = openai.ChatCompletion.create(\n  model=\"gpt-3.5-turbo\",\n  messages=[\n    {\"role\": \"user\", \"content\": \"Hello!\"}\n  ]\n)\n\nprint(completion.choices[0].message)\n",
                                :node.js
                                "const { Configuration, OpenAIApi } = require(\"openai\");\n\nconst configuration = new Configuration({\n  apiKey: process.env.OPENAI_API_KEY,\n});\nconst openai = new OpenAIApi(configuration);\n\nconst completion = await openai.createChatCompletion({\n  model: \"gpt-3.5-turbo\",\n  messages: [{role: \"user\", content: \"Hello world\"}],\n});\nconsole.log(completion.data.choices[0].message);\n"},
                               :parameters
                               "{\n  \"model\": \"gpt-3.5-turbo\",\n  \"messages\": [{\"role\": \"user\", \"content\": \"Hello!\"}]\n}\n",
                               :response
                               "{\n  \"id\": \"chatcmpl-123\",\n  \"object\": \"chat.completion\",\n  \"created\": 1677652288,\n  \"choices\": [{\n    \"index\": 0,\n    \"message\": {\n      \"role\": \"assistant\",\n      \"content\": \"\\n\\nHello there, how may I assist you today?\",\n    },\n    \"finish_reason\": \"stop\"\n  }],\n  \"usage\": {\n    \"prompt_tokens\": 9,\n    \"completion_tokens\": 12,\n    \"total_tokens\": 21\n  }\n}\n"}},
                             ...}
                            {:description nil,
                             :method :get,
                             :produces ["application/json"],
                             :path-schema {:file_id java.lang.String},
                             :query-schema nil,
                             :parameter-aliases
                             {:path-schema {[] {:file-id :file_id}},
                              :query-schema {},
                              :body-schema {},
                              :form-schema {},
                              :headers-schema {}},
                             :form-schema nil,
                             :path-parts ["/files/" :file_id "/content"],
                             :headers-schema nil,
                             :openapi-definition
                             {:operationId "downloadFile",
                              :tags ["OpenAI"],
                              :summary "Returns the contents of the specified file",
                              :parameters
                              [{:in "path",
                                :name "file_id",
                                :required true,
                                :schema {:type "string"},
                                :description "The ID of the file to use for this request"}],
                              :responses
                              {:200
                               {:description "OK",
                                :content #:application{:json {:schema {:type "string"}}}}},
                              :x-oaiMeta
                              {:name "Retrieve file content",
                               :group "files",
                               :path "retrieve-content",
                               :examples
                               {:curl
                                "curl https://api.openai.com/v1/files/file-XjGxS3KTG0uNmNOK362iJua3/content \\\n  -H 'Authorization: Bearer YOUR_API_KEY' > file.jsonl\n",
                                :python
                                "import os\nimport openai\nopenai.api_key = os.getenv(\"OPENAI_API_KEY\")\ncontent = openai.File.download(\"file-XjGxS3KTG0uNmNOK362iJua3\")\n",
                                :node.js
                                "const { Configuration, OpenAIApi } = require(\"openai\");\nconst configuration = new Configuration({\n  apiKey: process.env.OPENAI_API_KEY,\n});\nconst openai = new OpenAIApi(configuration);\nconst response = await openai.downloadFile(\"file-XjGxS3KTG0uNmNOK362iJua3\");\n"}}},
                             ...}
                            ...)
                           {:interceptors
                            ({:name :martian.interceptors/keywordize-params,
                              :enter
                              #object[martian.interceptors$fn__37838 0x17567a6d "martian.interceptors$fn__37838@17567a6d"]}
                             {:name :martian.interceptors/method,
                              :enter
                              #object[martian.interceptors$fn__37828 0x50856023 "martian.interceptors$fn__37828@50856023"]}
                             {:name :martian.interceptors/url,
                              :enter
                              #object[martian.interceptors$fn__37832 0x5e22586b "martian.interceptors$fn__37832@5e22586b"]}
                             {:name :martian.interceptors/query-params,
                              :enter
                              #object[martian.interceptors$fn__37841 0x4950bcbc "martian.interceptors$fn__37841@4950bcbc"]}
                             {:name :martian.interceptors/body-params,
                              :enter
                              #object[martian.interceptors$fn__37845 0x4bd51462 "martian.interceptors$fn__37845@4bd51462"]}
                             {:name :martian.interceptors/form-params,
                              :enter
                              #object[martian.interceptors$fn__37856 0x5365c557 "martian.interceptors$fn__37856@5365c557"]}
                             {:name :martian.interceptors/header-params,
                              :enter
                              #object[martian.interceptors$fn__37860 0x273f775d "martian.interceptors$fn__37860@273f775d"]}
                             {:name :martian.interceptors/enqueue-route-specific-interceptors,
                              :enter
                              #object[martian.interceptors$fn__37865 0x34d59ddc "martian.interceptors$fn__37865@34d59ddc"]}
                             {:name :martian.interceptors/encode-body,
                              :encodes
                              ("application/transit+msgpack"
                                "application/transit+json"
                                "application/edn"
                                "application/json"),
                              :enter
                              #object[martian.interceptors$encode_body$fn__37872 0x2fed08a "martian.interceptors$encode_body$fn__37872@2fed08a"]}
                             {:name :martian.interceptors/coerce-response,
                              :decodes
                              ("application/transit+msgpack"
                                "application/transit+json"
                                "application/edn"
                                "application/json"),
                              :enter
                              #object[martian.interceptors$coerce_response$fn__37881 0x39ae35dd "martian.interceptors$coerce_response$fn__37881@39ae35dd"],
                              :leave
                              #object[martian.interceptors$coerce_response$fn__37888 0x5e16b0ba "martian.interceptors$coerce_response$fn__37888@5e16b0ba"]}
                             ...)})
clojure.spec.alpha/failure: :instrument
clojure.spec.alpha/fn: martian.core/build-instance
clojure.spec.alpha/problems: ({:path [:handlers :consumes :clojure.spec.alpha/nil],
                               :pred nil?,
                               :val [nil],
                               :via [:martian.spec/handler :martian.spec/content-types],
                               :in [1 0 :consumes]}
                              {:path [:handlers :consumes :clojure.spec.alpha/pred],
                               :pred string?,
                               :val nil,
                               :via [:martian.spec/handler :martian.spec/content-types],
                               :in [1 0 :consumes 0]}
                              {:path [:handlers :consumes :clojure.spec.alpha/nil],
                               :pred nil?,
                               :val [nil],
                               :via [:martian.spec/handler :martian.spec/content-types],
                               :in [1 1 :consumes]}
                              {:path [:handlers :consumes :clojure.spec.alpha/pred],
                               :pred string?,
                               :val nil,
                               :via [:martian.spec/handler :martian.spec/content-types],
                               :in [1 1 :consumes 0]}
                              {:path [:handlers :consumes :clojure.spec.alpha/nil],
                               :pred nil?,
                               :val [nil],
                               :via [:martian.spec/handler :martian.spec/content-types],
                               :in [1 2 :consumes]}
                              {:path [:handlers :consumes :clojure.spec.alpha/pred],
                               :pred string?,
                               :val nil,
                               :via [:martian.spec/handler :martian.spec/content-types],
                               :in [1 2 :consumes 0]}
                              {:path [:handlers :consumes :clojure.spec.alpha/nil],
                               :pred nil?,
                               :val [nil],
                               :via [:martian.spec/handler :martian.spec/content-types],
                               :in [1 6 :consumes]}
                              {:path [:handlers :consumes :clojure.spec.alpha/pred],
                               :pred string?,
                               :val nil,
                               :via [:martian.spec/handler :martian.spec/content-types],
                               :in [1 6 :consumes 0]}
                              {:path [:handlers :consumes :clojure.spec.alpha/nil],
                               :pred nil?,
                               :val [nil],
                               :via [:martian.spec/handler :martian.spec/content-types],
                               :in [1 9 :consumes]}
                              {:path [:handlers :consumes :clojure.spec.alpha/pred],
                               :pred string?,
                               :val nil,
                               :via [:martian.spec/handler :martian.spec/content-types],
                               :in [1 9 :consumes 0]}
                              ...)

400 for create-file

I've just upgraded from 0.10.0 to 0.11.0 to give fine-tuning a spin (hopefully GPT-4 fine-tuning will be announced on Monday!)

But when running:

(wkok.openai-clojure.api/create-file
 {:purpose "fine-tune"
  :file (clojure.java.io/file "one.jsonl")}
 {:api-key openai-key})

I get a 400 for the options object.

ExceptionInfo
   status: 400
   {:request-time 876,
    :request
    {:user-info nil,
     :as :text,
     :headers
     {"Accept" "application/json",
      "Authorization"
      "Bearer 🐻 ",
      "content-type"
      "multipart/form-data; boundary=hatoBoundaryQTa_33CmKkdIY26ZhtH98PDN6JM1t0",
      "accept-encoding" "gzip, deflate"},
     :server-port nil,
     :url "https://api.openai.com/v1/files",
     :http-request
     #object[jdk.internal.net.http.HttpRequestImpl 0x166de34c "https://api.openai.com/v1/files POST"],
     :uri "/v1/files",
     :server-name "api.openai.com",
     :version :http-1.1,
     :query-string nil,
     :body
     #object[java.io.PipedInputStream 0x41bf8c3c "java.io.PipedInputStream@41bf8c3c"],
     :scheme :https,
     :request-method :post},
    :http-client
    #object[jdk.internal.net.http.HttpClientFacade 0x3b2f4320 "jdk.internal.net.http.HttpClientImpl@5913bcdb(9)"],
    :headers
    {"openai-organization" "user-me",
     "server" "cloudflare",
     "content-type" "application/json",
     "access-control-allow-origin" "*",
     "content-length" "177",
     "alt-svc" "h3=\":443\"; ma=86400",
     "openai-version" "2020-10-01",
     "strict-transport-security" "max-age=15724800; includeSubDomains",
     "openai-processing-ms" "77",
     "connection" "keep-alive",
     "cf-cache-status" "DYNAMIC",
     "cf-ray" "820d386008c96718-AMS",
     "date" "Sat, 04 Nov 2023 13:31:57 GMT",
     "x-request-id" "93f9bb55cb879c92f269b550f44fb1e7"},
    :status 400,
    :content-type :application/json,
    :uri "https://api.openai.com/v1/files",
    :content-type-params {},
    :version :http-1.1,
    :body
    "{\n  \"error\": {\n    \"message\": \"Additional properties are not allowed ('options' was unexpected)\",\n    \"type\": \"invalid_request_error\",\n    \"param\": null,\n    \"code\": null\n  }\n}\n"}

I hadn't tried this on 0.10.0 yet, so I can't say whether it's a regression.

Any ideas?

Support bb/sci

So, I'm using the library; so far so good. However, when I try to use it in a babashka script, I get an error saying LinkedList is not supported in bb:

The bb.edn file:

{:paths ["bb"]
 ; todo: remove it
 :deps {net.clojars.wkok/openai-clojure {:mvn/version "0.16.0"}}}

The script:

(require '[wkok.openai-clojure.api :as api])

(defn- get-first-response [res]
  (-> res
      :choices
      first
      :message
      :content))

; TODO: I feel there needs to be a openai.clj that handles request 
; tally of tokens, response returning (what about specific edn schema?) and 
; maybe saving of history.
; 
; Also need a usr-contex thing, like a user-id so that I can tally token
; no matter in web or on cli.
(defn get-openai-response [sys-prompt, usr-prompt]
  (get-first-response
   (api/create-chat-completion
    {:model "gpt-3.5-turbo"
     :messages [{:role "system" :content sys-prompt}
                {:role "user" :content usr-prompt}]})))


(get-openai-response "you're a helpful assistant" "hello")

Error message:

bb tst.clj
----- Error --------------------------------------------------------------------
Type:     java.lang.Exception
Message:  Unable to resolve classname: java.util.LinkedList
Location: wkok/openai_clojure/sse.clj:10:3

----- Context ------------------------------------------------------------------
 6:    [clojure.core.async :as a]
 7:    [clojure.string :as string]
 8:    [cheshire.core :as json]
 9:    [clojure.core.async.impl.protocols :as impl])
10:   (:import (java.io InputStream)
      ^--- Unable to resolve classname: java.util.LinkedList
11:            (clojure.lang Counted)
12:            (java.util    LinkedList)))
13:
14: (def event-mask (re-pattern (str "(?s).+?\n\n")))
15:

----- Stack trace --------------------------------------------------------------
wkok.openai-clojure.sse   - wkok/openai_clojure/sse.clj:10:3
wkok.openai-clojure.azure - wkok/openai_clojure/azure.clj:2:3
wkok.openai-clojure.core  - wkok/openai_clojure/core.clj:2:3
wkok.openai-clojure.api   - wkok/openai_clojure/api.clj:2:3
user                      - /Users/jianghongying/code/conavi/tst.clj:1:1

Any thoughts on adding support for it? I believe the culprit is how LinkedList is used in the first place.
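
For what it's worth, the mutable java.util.LinkedList buffer could probably be replaced by clojure.lang.PersistentQueue, which babashka's sci does expose. A rough FIFO sketch (not the library's actual code, just an illustration of the idea):

```clojure
;; rough sketch, not the library's implementation: a FIFO buffer built on
;; clojure.lang.PersistentQueue (available in babashka) rather than
;; java.util.LinkedList
(defn make-queue []
  clojure.lang.PersistentQueue/EMPTY)

(defn push-item [q x]
  (conj q x))

(defn pop-item
  "Returns [head rest-of-queue]."
  [q]
  [(peek q) (pop q)])
```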

Images using Azure

Hi, thank you for the repo. Just confirming: would I need to figure out my own way to do images using my Azure key?

Thanks again

Namespace "martian.encoders" not found

Hello, I am using openai-clojure version 0.16.0. When I evaluate my project with CIDER, I get the following errors:

  1. first evaluation after jacking in: com.fasterxml.jackson.core.StreamWriteConstraints
  2. second evaluation after the previous one: no such variable json/encode
  3. third and after: namespace "martian.encoders" not found

Attached is my project.clj:

(defproject voting-records "0.1.0-SNAPSHOT"
  :description "FIXME: write description"
  :url "http://example.com/FIXME"
  :license {:name "EPL-2.0 OR GPL-2.0-or-later WITH Classpath-exception-2.0"
            :url "https://www.eclipse.org/legal/epl-2.0/"}
  :dependencies [[org.clojure/clojure "1.11.1"]
                 [environ "1.2.0"]
                 [com.cognitect.aws/api "0.8.686"]
                 [com.cognitect.aws/endpoints "1.1.12.504"]
                 [com.cognitect.aws/s3 "848.2.1413.0"]
                 [io.pinecone/pinecone-client "0.7.2"]
                 [com.taoensso/carmine  "3.3.2"]
                 [org.clojure/data.xml "0.2.0-alpha8"]
                 [cheshire "5.13.0"]
                 [org.apache.opennlp/opennlp-tools "2.3.2"]
                 [org.apache.pdfbox/pdfbox "3.0.1"]
                 [org.apache.pdfbox/io "3.0.0-alpha3"]
                 [com.brunobonacci/mulog-adv-console "0.9.0"]
                 [net.clojars.wkok/openai-clojure "0.16.0"]]
  :main ^:skip-aot voting-records.core
  :target-path "target/%s"
  :aliases {"kaocha" ["run" "-m" "kaocha.runner"]}
  :profiles {:uberjar {:aot :all
                       :jvm-opts ["-Dclojure.compiler.direct-linking=true"]}
             :test {:dependencies [[lambdaisland/kaocha "1.87.1366"]
                                   [org.clojure/test.check "0.9.0"]]}})
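
The com.fasterxml.jackson.core.StreamWriteConstraints error on first load is the usual symptom of two dependencies pulling in conflicting Jackson versions (cheshire 5.13.0 needs a recent Jackson, while another dependency may pin an older jackson-core). Running `lein deps :tree` should show the conflict. One possible fix (the version numbers below are an assumption, check what cheshire actually requires) is to pin Jackson explicitly near the top of :dependencies:

```clojure
;; sketch: pin one Jackson version ahead of the libraries that drag in
;; older ones (exact version is an assumption -- match cheshire's requirement)
:dependencies [[com.fasterxml.jackson.core/jackson-core "2.17.0"]
               [com.fasterxml.jackson.core/jackson-databind "2.17.0"]
               [com.fasterxml.jackson.dataformat/jackson-dataformat-smile "2.17.0"]
               [com.fasterxml.jackson.dataformat/jackson-dataformat-cbor "2.17.0"]
               ;; ...rest of the dependencies as before
               ]
```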

Wrong API version on Azure endpoint?

Hi, when I call the API I get the following response:

:uri https://bali.openai.azure.com/openai/deployments/gpt-4-0613/chat/completions?api-version=2023-12-01-preview, :content-type-params {}, :version :http-1.1, :body {"error":{"code":"401","message":"Access denied due to invalid subscription key or wrong API endpoint. Make sure to provide a valid key for an active subscription and use a correct regional API endpoint for your resource."}}}
; Caught an exception during OpenAI API call: Interceptor Exception: status: 401

I compared the two strings and the only difference is the API version:

Used by the API:
https://bali.openai.azure.com/openai/deployments/gpt-4-0613/chat/completions?api-version=2023-12-01-preview

Provided by Azure:
https://bali.openai.azure.com/openai/deployments/gpt-4-0613/chat/completions?api-version=2023-03-15-preview

What am I missing? Here's the code:

response (api/create-chat-completion
                          {:messages [{:role "user" :content dynamic-message}]
                           :temperature 0.8
                           :max_tokens 4096
                           :top_p 1
                           :model "gpt-4-0613"
                           :frequency_penalty 0.0
                           :presence_penalty 0.0}
                          {:impl :azure
                           :api-key api-key
                           :api-endpoint api-endpoint})

The sliding buffer default size in sse is too short

Using function calls (maybe because they're a tad slower), I realized I often had problems when parsing streamed results, as some tokens would get dropped along the way.

(loop [s (<!! c)] ;; sse channel
  (let [fc (-> s :choices first :delta :function_call)
        x  (or (:name fc) (:arguments fc))]
    (when x
      (Thread/sleep (long (rand 100))) ;; simulate delay
      (print x)
      (flush)
      (recur (<!! c)))))

Explicitly passing :max_tokens 1000 fixes this issue.

(defn calc-buffer-size
  "Buffer size should be at least equal to max_tokens
   or 16 (the default in openai as of 2023-02-19)
   plus the [DONE] terminator"
  [{:keys [max_tokens]
    :or {max_tokens 16}}]
  (inc max_tokens))

I think it's too low. My use case is fairly basic: query, print characters as they are appended to a StringBuffer, then parse.

Also, the OpenAI API Reference now states the following (no longer 16):

max_tokens
integer or null
Optional
Defaults to inf
The maximum number of tokens to generate in the chat completion.
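
If the buffer really must track max_tokens, a possible tweak (just a sketch, not the library's actual code; the 4096 fallback is an arbitrary assumption) is to fall back to a much larger default when the caller doesn't pass :max_tokens:

```clojure
;; sketch: use a generous fallback (4096 here is arbitrary) when the
;; caller does not pass :max_tokens, since the API default is unbounded
(defn calc-buffer-size
  [{:keys [max_tokens]}]
  (inc (or max_tokens 4096)))
```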

Support multiple API versions?

Maybe we should think about something "now" to ensure future compatibility with new API versions
(while at the same time staying backwards compatible at the Clojure level).

Maybe nothing is needed; I just want to be sure.
I somehow think that Azure will come out with a new API version "soon", as the current one is "rather incomplete".

Default request timeout

What exactly is the default timeout for the HTTP client, and how do I override this value? I have seen requests getting stuck for longer than I would like.
Thanks!
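
I'm not certain of the library default offhand, but the underlying HTTP client is hato over java.net.http.HttpClient, which applies no request deadline unless one is set. The option names below are hato's; whether and how openai-clojure forwards them is an assumption on my part, not something I've verified:

```clojure
;; hato option names (values in milliseconds); how openai-clojure exposes
;; these -- if at all -- is an assumption, not something I've verified
(def client-opts  {:connect-timeout 10000}) ;; cap on establishing the connection
(def request-opts {:timeout 60000})         ;; cap on the whole request

;; with hato directly it would look like:
;; (require '[hato.client :as hc])
;; (def client (hc/build-http-client client-opts))
;; (hc/post url (merge request-opts {:http-client client}))
```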

support caching via redis ?

As the API calls cost money, I have found it very useful, during "development" of any code calling OpenAI, to have persistent caching of the requests and answers.

That way I can at least re-run my code often without paying every time. This can be done very cleanly using Redis and a Clojure implementation of "memoize" which persists into Redis.

This is (likely / maybe) not needed for production, as "in production scenarios" I would not expect repeated calls to OpenAI with precisely the same parameters (prompts), at least not for "completion".

I would find it useful to have this in the API, so I don't need to add it on top of this API every time I use it (and I would add it every time ...)
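
To illustrate the idea, here is a minimal in-memory version; a Redis-backed one would swap the atom for e.g. a carmine-based store. All names here are hypothetical, not part of the library:

```clojure
;; minimal in-memory sketch of request/response caching; the atom stands
;; in for a persistent Redis store (names are hypothetical)
(defonce api-cache (atom {}))

(defn cached-call
  "Memoize f (e.g. api/create-chat-completion) on its params map."
  [f params]
  (or (get @api-cache params)
      (let [res (f params)]
        (swap! api-cache assoc params res)
        res)))
```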

400 Error thrown due to max token length exceeded returns the request body when streaming

Steps to reproduce:

  1. Collect a sequence of messages to pass to OpenAI (of type {:role role :content content}) that exceed the model's token limit. In my case, GPT-3.5 Turbo with 4097 tokens
  2. Hit the synchronous chat completions endpoint. I use OpenAI without Azure. You get an exception HTTP response with status 400 and the following body:
:body
    "{\n  \"error\": {\n    \"message\": \"This model's maximum context length is 4097 tokens. However, your messages resulted in <more>. Please reduce the length of the messages"
  3. Repeat the same experiment, this time setting the :stream parameter to true. My setup doesn't pass a handler, in order to receive a channel back
  4. You should see a similar HTTP 400 exception thrown, but in the body you will see your request body rather than the helpful error message.

Attached below is the function I use to invoke the OpenAI endpoint:

(defn ask
 ([questions]
  (ask questions {} {}))
 ([questions payload-params]
  (ask questions payload-params {}))
 ([questions payload-params  opts]
  (let [payload (merge payload-params {:model "gpt-3.5-turbo"
                                       :messages questions
                                       :temperature 0})
        options (assoc opts :api-key openai-token)]
    (api/create-chat-completion payload options))))

In this case it was invoked as (ask messages {:stream true})

Thank you for making this library; it was tremendously helpful in avoiding a lot of really boring work.
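
For the synchronous case, the error message can at least be dug out of the thrown ExceptionInfo. A hypothetical helper (assuming the response body sits under :body in the ex-data, as the dumps above suggest):

```clojure
;; hypothetical helper: pull the response body out of the ExceptionInfo
;; thrown for non-2xx responses (assumes it sits under :body in ex-data)
(defn api-error-body [e]
  (:body (ex-data e)))

;; usage sketch:
;; (try (api/create-chat-completion payload options)
;;      (catch clojure.lang.ExceptionInfo e
;;        (println "OpenAI error:" (api-error-body e))))
```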
