
go-chatgpt-api's Issues

A 400 is returned if the prompt contains a newline

ERRO[364699] selenium error: panic, need to create a new session and refresh
[GIN] 2023/04/19 - 06:48:26 | 400 |   61.951324ms |       127.0.0.1 | POST     "/conversation"
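
In case it helps narrow this down: one possible cause (an assumption, not confirmed by the log above) is that a raw newline in the prompt breaks the JSON request body. A minimal Go sketch of escaping it by building the body with encoding/json; the "prompt" field name is illustrative only:

package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// The prompt contains a raw newline, which is illegal inside a JSON string literal.
	prompt := "first line\nsecond line"

	// json.Marshal escapes the newline as \n, so the request body stays valid JSON.
	body, err := json.Marshal(map[string]string{"prompt": prompt})
	if err != nil {
		panic(err)
	}
	fmt.Println(string(body)) // {"prompt":"first line\nsecond line"}
}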

Missing access token

This error keeps coming up; it goes away after a restart, but I haven't figured out the cause.

go-chatgpt-api          | INFO[9811] Missing access token

The chatgpt-proxy-server container fails to start

It stays in the Restarting state:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
1773cb839b29 linweiyuan/chatgpt-proxy-server "./undetected_chrome…" 24 minutes ago Restarting (133) 40 seconds ago chatgpt-proxy-server
Inside the container it fails to create threads: pthread_create: Operation not permitted (1)
It turned out my VPS simply had too little memory.

After starting the containers, there is no process listening on port 8080

/app # ls
go-chatgpt-api
/app # netstat -npt
Active Internet connections (w/o servers)
Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name
tcp 0 0 172.17.0.6:42328 172.17.0.3:9515 ESTABLISHED 1/go-chatgpt-api

Responses lag behind

Normally the Stop Responding button is shown while the answer is being generated and disappears automatically once it finishes.

But when I reverse-proxy through this project's API, the button only disappears about ten seconds later, and occasionally the cursor stops following the text and keeps blinking at the bottom right (this is rare).

The two community APIs don't have this problem, so I'm reporting it here in case it can be reproduced; I haven't tested the Java version.

invalid selector

Not sure whether anyone else has run into this.

go-chatgpt-api          | INFO[42508] xpath lookup error: invalid selector: Failed to execute 'send' on 'XMLHttpRequest': Failed to load 'https://chat.openai.com/auth/login?next=/chat'.
go-chatgpt-api          |   (Session info: chrome=111.0.5563.64)
go-chatgpt-api          |   (Driver info: chromedriver=111.0.5563.64 (c710e93d5b63b7095afe8c2c17df34408078439d-refs/branch-heads/5563@{#995}),platform=Linux 3.10.0-1160.88.1.el7.x86_64 x86_64)

pthread_create: Operation not permitted (1)

[1681713279.028][SEVERE]: pthread_create: Operation not permitted (1)
Starting ChromeDriver 111.0.5563.64 (c710e93d5b63b7095afe8c2c17df34408078439d-refs/branch-heads/5563@{#995}) on port 9515
All remote connections are allowed. Use an allowlist instead!
Please see https://chromedriver.chromium.org/security-considerations for suggestions on keeping ChromeDriver safe.
[1681719833.469][SEVERE]: pthread_create: Operation not permitted (1)
This error is reported when starting with Docker.

Not supported on arm64 machines

Hi everyone, I'm using an Oracle Cloud arm machine and found that the Arch Linux based image used by chatgpt-proxy-server doesn't work on arm. Is there a workaround? 🤔

Failed to handle captcha: timeout after 1m0.110652176s


It shows "Failed to handle captcha" and then "Access denied", but using the v2ray proxy on the same server I can log in to chat.openai.com directly. Warp is not enabled. The configuration I'm using is:

version: "3"

services:
  go-chatgpt-api:
    container_name: go-chatgpt-api
    image: linweiyuan/go-chatgpt-api
    ports:
      - 8080:8080 # uncomment if you need to expose the port (e.g. one deployment serving several clients)
    environment:
      - GIN_MODE=release
      - CHATGPT_PROXY_SERVER=http://chatgpt-proxy-server:9515
      # - NETWORK_PROXY_SERVER=http://host:port
    depends_on:
      - chatgpt-proxy-server
    restart: unless-stopped

  chatgpt-proxy-server:
    container_name: chatgpt-proxy-server
    image: linweiyuan/chatgpt-proxy-server
    restart: unless-stopped

The Docker container that uses port 9515 won't start. How can I fix this?

version: '3'


services:
  app:
    image: chenzhaoyu94/chatgpt-web # always use latest; when updating, just re-pull this tag
    ports:
      - 127.0.0.1:3002:3002
    environment:
      # choose one of the two
      OPENAI_API_KEY:
      # choose one of the two
      OPENAI_ACCESS_TOKEN: eyJhbGciOiJSUzI1NiIsInR5cCI6Ikpxxxxx
      # API base URL, optional, only used when OPENAI_API_KEY is set
      OPENAI_API_BASE_URL:
      # API model, optional, only used when OPENAI_API_KEY is set, https://platform.openai.com/docs/models
      # gpt-4, gpt-4-0314, gpt-4-32k, gpt-4-32k-0314, gpt-3.5-turbo, gpt-3.5-turbo-0301, text-davinci-003, text-davinci-002, code-davinci-002
      OPENAI_API_MODEL:
      # reverse proxy, optional
      API_REVERSE_PROXY: http://go-chatgpt-api:8080/conversation
      # access secret key, optional
      AUTH_SECRET_KEY:
      # max requests per hour, optional, unlimited by default
      MAX_REQUEST_PER_HOUR: 0
      # timeout in milliseconds, optional
      TIMEOUT_MS: 60000
      # SOCKS proxy, optional, only effective together with SOCKS_PROXY_PORT
      SOCKS_PROXY_HOST:
      # SOCKS proxy port, optional, only effective together with SOCKS_PROXY_HOST
      SOCKS_PROXY_PORT:
      # HTTPS proxy, optional, supports http, https, socks5
      HTTPS_PROXY:
    depends_on:
      - go-chatgpt-api


  go-chatgpt-api:
    container_name: go-chatgpt-api
    image: linweiyuan/go-chatgpt-api
    environment:
      - GIN_MODE=release
      - CHATGPT_PROXY_SERVER=http://chatgpt-proxy-server:9515
#      - NETWORK_PROXY_SERVER=http://host:port
#      - NETWORK_PROXY_SERVER=socks5://host:port
    depends_on:
      - chatgpt-proxy-server
    restart: unless-stopped


  chatgpt-proxy-server:
    container_name: chatgpt-proxy-server
    image: linweiyuan/chatgpt-proxy-server
    restart: unless-stopped


Or could it be switched to a different port?

"continue" doesn't seem to do anything

The result I get is shown below; it's as if the context was lost.

In this example, the program completes DNS resolution by creating a UDP socket, setting the destination address and port, sending the DNS request message, receiving the DNS response message, and parsing it. Note that the request and response messages here use the most basic format; in practice the DNS protocol's mesSure, what would you like me to continue with?

429 Too Many Requests

Client error '429 Too Many Requests' for url 'http://go-chatgpt-api:8080/conversation'
For more information check: https://httpstatuses.com/429

What could be causing this?

Response data events are concatenated into a single chunk

Could this be caused by a missing \r\n separator between the data events?

{
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": ["\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\ufdata:{"message": {"id": "c1375983-8681-4978-959b-e5bfc39b157b", "author": {"role": "assistant", "name": null, "metadata": {}}, "create_time": 1681017612.273399, "update_time": null, "content": {"content_type": "text", "parts": ["\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n
                        },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n
                            }\n      return null;\n
                        },\n
                    },\n
                };\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff0c\u5728input\u7684watch\u4e2d\u76d1\u542cvalue\u7684\u53d8\u5316\uff0c\u5e76\u5c06\u5176"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff0c\u5728input\u7684watch\u4e2d\u76d1\u542cvalue\u7684\u53d8\u5316\uff0c\u5e76\u5c06\u5176\u5b58\u5165LocalStorage"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff0c\u5728input\u7684watch\u4e2d\u76d1\u542cvalue\u7684\u53d8\u5316\uff0c\u5e76\u5c06\u5176\u5b58\u5165LocalStorage\u4e2d"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff0c\u5728input\u7684watch\u4e2d\u76d1\u542cvalue\u7684\u53d8\u5316\uff0c\u5e76\u5c06\u5176\u5b58\u5165LocalStorage\u4e2d\u3002\n\n\u8fd9\u6837\u5373\u4f7f\u5728\u5f00\u542f\u591a\u4e2a\u76f8\u540c\u7f51\u9875\u9875\u9762\u7684\u60c5\u51b5\u4e0b"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff0c\u5728input\u7684watch\u4e2d\u76d1\u542cvalue\u7684\u53d8\u5316\uff0c\u5e76\u5c06\u5176\u5b58\u5165LocalStorage\u4e2d\u3002\n\n\u8fd9\u6837\u5373\u4f7f\u5728\u5f00\u542f\u591a\u4e2a\u76f8\u540c\u7f51\u9875\u9875\u9762\u7684\u60c5\u51b5\u4e0b\uff0c\u6bcf\u4e2a\u9875\u9762\u90fd\u80fd\u72ec\u7acb\u7f13\u5b58\u81ea\u5df1\u7684"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff0c\u5728input\u7684watch\u4e2d\u76d1\u542cvalue\u7684\u53d8\u5316\uff0c\u5e76\u5c06\u5176\u5b58\u5165LocalStorage\u4e2d\u3002\n\n\u8fd9\u6837\u5373\u4f7f\u5728\u5f00\u542f\u591a\u4e2a\u76f8\u540c\u7f51\u9875\u9875\u9762\u7684\u60c5\u51b5\u4e0b\uff0c\u6bcf\u4e2a\u9875\u9762\u90fd\u80fd\u72ec\u7acb\u7f13\u5b58\u81ea\u5df1\u7684input\u503c\u3002\u4f46\u9700\u8981\u6ce8\u610f\u7684\u662f\uff0cLocalStorage\u7f13\u5b58\u662f"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff0c\u5728input\u7684watch\u4e2d\u76d1\u542cvalue\u7684\u53d8\u5316\uff0c\u5e76\u5c06\u5176\u5b58\u5165LocalStorage\u4e2d\u3002\n\n\u8fd9\u6837\u5373\u4f7f\u5728\u5f00\u542f\u591a\u4e2a\u76f8\u540c\u7f51\u9875\u9875\u9762\u7684\u60c5\u51b5\u4e0b\uff0c\u6bcf\u4e2a\u9875\u9762\u90fd\u80fd\u72ec\u7acb\u7f13\u5b58\u81ea\u5df1\u7684input\u503c\u3002\u4f46\u9700\u8981\u6ce8\u610f\u7684\u662f\uff0cLocalStorage\u7f13\u5b58\u662f\u57fa\u4e8e\u6d4f\u89c8\u5668"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff0c\u5728input\u7684watch\u4e2d\u76d1\u542cvalue\u7684\u53d8\u5316\uff0c\u5e76\u5c06\u5176\u5b58\u5165LocalStorage\u4e2d\u3002\n\n\u8fd9\u6837\u5373\u4f7f\u5728\u5f00\u542f\u591a\u4e2a\u76f8\u540c\u7f51\u9875\u9875\u9762\u7684\u60c5\u51b5\u4e0b\uff0c\u6bcf\u4e2a\u9875\u9762\u90fd\u80fd\u72ec\u7acb\u7f13\u5b58\u81ea\u5df1\u7684input\u503c\u3002\u4f46\u9700\u8981\u6ce8\u610f\u7684\u662f\uff0cLocalStorage\u7f13\u5b58\u662f\u57fa\u4e8e\u6d4f\u89c8\u5668\u7684\uff0c\u4e0d\u540c\u6d4f\u89c8"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff0c\u5728input\u7684watch\u4e2d\u76d1\u542cvalue\u7684\u53d8\u5316\uff0c\u5e76\u5c06\u5176\u5b58\u5165LocalStorage\u4e2d\u3002\n\n\u8fd9\u6837\u5373\u4f7f\u5728\u5f00\u542f\u591a\u4e2a\u76f8\u540c\u7f51\u9875\u9875\u9762\u7684\u60c5\u51b5\u4e0b\uff0c\u6bcf\u4e2a\u9875\u9762\u90fd\u80fd\u72ec\u7acb\u7f13\u5b58\u81ea\u5df1\u7684input\u503c\u3002\u4f46\u9700\u8981\u6ce8\u610f\u7684\u662f\uff0cLocalStorage\u7f13\u5b58\u662f\u57fa\u4e8e\u6d4f\u89c8\u5668\u7684\uff0c\u4e0d\u540c\u6d4f\u89c8\u5668\u95f4\u662f\u4e0d\u80fd\u5171\u4eab"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff0c\u5728input\u7684watch\u4e2d\u76d1\u542cvalue\u7684\u53d8\u5316\uff0c\u5e76\u5c06\u5176\u5b58\u5165LocalStorage\u4e2d\u3002\n\n\u8fd9\u6837\u5373\u4f7f\u5728\u5f00\u542f\u591a\u4e2a\u76f8\u540c\u7f51\u9875\u9875\u9762\u7684\u60c5\u51b5\u4e0b\uff0c\u6bcf\u4e2a\u9875\u9762\u90fd\u80fd\u72ec\u7acb\u7f13\u5b58\u81ea\u5df1\u7684input\u503c\u3002\u4f46\u9700\u8981\u6ce8\u610f\u7684\u662f\uff0cLocalStorage\u7f13\u5b58\u662f\u57fa\u4e8e\u6d4f\u89c8\u5668\u7684\uff0c\u4e0d\u540c\u6d4f\u89c8\u5668\u95f4\u662f\u4e0d\u80fd\u5171\u4eab\u7f13\u5b58\u7684\uff0c\u4e5f\u4e0d"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff0c\u5728input\u7684watch\u4e2d\u76d1\u542cvalue\u7684\u53d8\u5316\uff0c\u5e76\u5c06\u5176\u5b58\u5165LocalStorage\u4e2d\u3002\n\n\u8fd9\u6837\u5373\u4f7f\u5728\u5f00\u542f\u591a\u4e2a\u76f8\u540c\u7f51\u9875\u9875\u9762\u7684\u60c5\u51b5\u4e0b\uff0c\u6bcf\u4e2a\u9875\u9762\u90fd\u80fd\u72ec\u7acb\u7f13\u5b58\u81ea\u5df1\u7684input\u503c\u3002\u4f46\u9700\u8981\u6ce8\u610f\u7684\u662f\uff0cLocalStorage\u7f13\u5b58\u662f\u57fa\u4e8e\u6d4f\u89c8\u5668\u7684\uff0c\u4e0d\u540c\u6d4f\u89c8\u5668\u95f4\u662f\u4e0d\u80fd\u5171\u4eab\u7f13\u5b58\u7684\uff0c\u4e5f\u4e0d\u9002\u7528\u4e8e\u670d\u52a1\u5668\u7aef\u6e32\u67d3\u7b49\u573a"
            ]
        },
        "end_turn": null,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha"
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}data: {
    "message": {
        "id": "c1375983-8681-4978-959b-e5bfc39b157b",
        "author": {
            "role": "assistant",
            "name": null,
            "metadata": {}
        },
        "create_time": 1681017612.273399,
        "update_time": null,
        "content": {
            "content_type": "text",
            "parts": [
                "\u5728Vue\u4e2d\uff0c\u53ef\u4ee5\u4f7f\u7528\u6d4f\u89c8\u5668\u7684\u672c\u5730\u5b58\u50a8\uff08LocalStorage\uff09\u6765\u7f13\u5b58input\u7684\u503c\uff0c\u8fd9\u6837\u5373\u4f7f\u5237\u65b0\u9875\u9762\u6216\u8005\u5173\u95ed\u518d\u6253\u5f00\u9875\u9762\uff0c\u8f93\u5165\u6846\u4e2d\u7684\u503c\u4e5f\u80fd\u591f\u88ab\u4fdd\u7559\u4e0b\u6765\u3002\u4e0b\u9762\u662f\u4e00\u4e2a\u57fa\u4e8eLocalStorage\u7684\u7b80\u5355\u793a\u4f8b\uff1a\n\n1. \u5b9a\u4e49\u4e00\u4e2aMixin\uff0c\u5728\u5176\u4e2d\u5b9e\u73b0\u5bf9LocalStorage\u7684\u8bfb\u5199\u64cd\u4f5c\uff1a\n\n```\nconst CacheMixin = {\n  methods: {\n    setCache(key, value) {\n      localStorage.setItem(key, JSON.stringify(value));\n    },\n    getCache(key) {\n      const value = localStorage.getItem(key);\n      if (value) {\n        return JSON.parse(value);\n      }\n      return null;\n    },\n  },\n};\n```\n\n2. \u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff1a\n\n```\n<template>\n  <div>\n    <input type=\"text\" v-model=\"value\" />\n  </div>\n</template>\n\n<script>\nimport CacheMixin from \"@/mixins/cache\";\n\nexport default {\n  mixins: [CacheMixin],\n  data() {\n    return {\n      value: \"\",\n    };\n  },\n  mounted() {\n    const cacheValue = this.getCache(\"inputValue\");\n    if (cacheValue) {\n      this.value = cacheValue;\n    }\n  },\n  watch: {\n    value(newVal) {\n      this.setCache(\"inputValue\", newVal);\n    },\n  },\n};\n</script>\n```\n\n\u8fd9\u91cc\u6211\u4eec\u5b9a\u4e49\u4e86\u4e00\u4e2aCacheMixin\uff0c\u901a\u8fc7\u8c03\u7528getCache\u548csetCache\u65b9\u6cd5\u5b9e\u73b0\u8bfb\u5199LocalStorage\u3002\u5728input\u7ec4\u4ef6\u4e2d\u4f7f\u7528Mixin\uff0c\u5e76\u5728mounted\u751f\u547d\u5468\u671f\u4e2d\u8bfb\u53d6\u7f13\u5b58\u7684\u503c\u5e76\u8bbe\u7f6e\u5230input\u4e2d\uff0c\u5728input\u7684watch\u4e2d\u76d1\u542cvalue\u7684\u53d8\u5316\uff0c\u5e76\u5c06\u5176\u5b58\u5165LocalStorage\u4e2d\u3002\n\n\u8fd9\u6837\u5373\u4f7f\u5728\u5f00\u542f\u591a\u4e2a\u76f8\u540c\u7f51\u9875\u9875\u9762\u7684\u60c5\u51b5\u4e0b\uff0c\u6bcf\u4e2a\u9875\u9762\u90fd\u80fd\u72ec\u7acb\u7f13\u5b58\u81ea\u5df1\u7684input\u503c\u3002\u4f46\u9700\u8981\u6ce8\u610f\u7684\u662f\uff0cLocalStorage\u7f13\u5b58\u662f\u57fa\u4e8e\u6d4f\u89c8\u5668\u7684\uff0c\u4e0d\u540c\u6d4f\u89c8\u5668\u95f4\u662f\u4e0d\u80fd\u5171\u4eab\u7f13\u5b58\u7684\uff0c\u4e5f\u4e0d\u9002\u7528\u4e8e\u670d\u52a1\u5668\u7aef\u6e32\u67d3\u7b49\u573a\u666f\u3002"
            ]
        },
        "end_turn": true,
        "weight": 1.0,
        "metadata": {
            "message_type": "next",
            "model_slug": "text-davinci-002-render-sha",
            "finish_details": {
                "type": "stop",
                "stop": "<|im_end|>"
            }
        },
        "recipient": "all"
    },
    "conversation_id": "0d4755ce-762c-4b3d-a3c3-8b95e1af4138",
    "error": null
}
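
If the missing-separator theory above is right, the fix is to emit each upstream message as its own SSE event and flush it before the next one arrives. A minimal Go sketch (an illustrative handler, not the project's actual code):

package main

import (
	"fmt"
	"net/http"
)

func streamHandler(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "text/event-stream")

	flusher, ok := w.(http.Flusher)
	if !ok {
		http.Error(w, "streaming unsupported", http.StatusInternalServerError)
		return
	}

	// messages stands in for whatever channel delivers the upstream JSON payloads.
	messages := make(chan string, 2)
	messages <- `{"message": "chunk 1"}`
	messages <- `{"message": "chunk 2"}`
	close(messages)

	for msg := range messages {
		// "data: <payload>" followed by a blank line is one complete SSE event;
		// flushing right away prevents several events from being glued into one chunk.
		fmt.Fprintf(w, "data: %s\n\n", msg)
		flusher.Flush()
	}
}

func main() {
	http.HandleFunc("/conversation", streamHandler)
	http.ListenAndServe(":8080", nil)
}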

429

The new version seems to keep returning 429.

Attaching to go-chatgpt-api, chatgpt-proxy-server
go-chatgpt-api          | INFO[0024] Welcome to ChatGPT
go-chatgpt-api          | [GIN] 2023/04/12 - 10:41:49 | 429 |  777.417411ms |       127.0.0.1 | POST     "/conversation"
go-chatgpt-api          | [GIN] 2023/04/12 - 10:42:03 | 429 |   1.21624546s |       127.0.0.1 | POST     "/conversation"
go-chatgpt-api          | [GIN] 2023/04/12 - 10:43:10 | 200 |  473.172461ms |       127.0.0.1 | GET      "/conversations?offset=0&limit=1"
go-chatgpt-api          | [GIN] 2023/04/12 - 10:43:13 | 200 |  2.553418328s |       127.0.0.1 | GET      "/conversation/0d4755ce-762c-4b3d-a3c3-8b95e1af4138"
go-chatgpt-api          | INFO[0156] 413[object Object]
go-chatgpt-api          | ERRO[0156] invalid character '[' after top-level value
go-chatgpt-api          | INFO[0156] 413[object Object]
go-chatgpt-api          | ERRO[0156] invalid character '[' after top-level value
go-chatgpt-api          | INFO[0156] 413[object Object]
go-chatgpt-api          | ERRO[0156] invalid character '[' after top-level value
go-chatgpt-api          | INFO[0156] 413[object Object]
go-chatgpt-api          | ERRO[0156] invalid character '[' after top-level value
go-chatgpt-api          | INFO[0156] 413[object Object]
go-chatgpt-api          | ERRO[0156] invalid character '[' after top-level value

Please add a timeout check

Sometimes the service hangs and API calls get no response. One reproducible case: after a request made with an expired access token, it hangs; subsequent calls with a valid access token then return 504, and the only fix is to restart the Docker container.
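
A rough sketch of what such a timeout could look like, assuming the upstream call can be wrapped in a context; callUpstream and the 60-second limit are illustrative placeholders, not the project's actual code:

package main

import (
	"context"
	"errors"
	"fmt"
	"time"
)

// callUpstream stands in for whatever function actually talks to ChatGPT; here it
// simulates a hung session by never answering within the deadline.
func callUpstream(ctx context.Context) (string, error) {
	select {
	case <-time.After(10 * time.Minute):
		return "reply", nil
	case <-ctx.Done():
		return "", ctx.Err()
	}
}

func handleConversation() (string, error) {
	// 60 seconds mirrors the TIMEOUT_MS value used elsewhere in this thread.
	ctx, cancel := context.WithTimeout(context.Background(), 60*time.Second)
	defer cancel()

	reply, err := callUpstream(ctx)
	if errors.Is(err, context.DeadlineExceeded) {
		return "", fmt.Errorf("conversation timed out, please retry or refresh the session: %w", err)
	}
	return reply, err
}

func main() {
	fmt.Println(handleConversation())
}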

Checking captcha

INFO[0631] Checking captcha
ERRO[0646] Failed to handle captcha: timeout after 15.289259508s
ERRO[0646]
ERRO[0646] <script>window['__CF$cv$params']={r:'793e0136a57ef973',m:'D9aoBgw6nXhTwDNsh1zUqiBvkHfsE.xvwl9aKYVnX4w-1675457068-0-AQDZ6VQlzLEk7EqLco1GkT5ydLmrle2ABYqrehe3o61K/10yAkzQLgFUQQUpnH57r16DzfjFt59EoEUqVRHOB5Ml6eut8CvvkaSwi0+5Kk1+TZrh+dOyMhFp5lHSRhbu9HF75amCEs1hwX4T22w+CsU=',s:[0x2ccf6e6f29,0xddf5dd5974],u:'/cdn-cgi/challenge-platform/h/b'};var now=Date.now()/1000,offset=14400,ts=''+(Math.floor(now)-Math.floor(now%offset)),_cpo=document.createElement('script');_cpo.nonce='',_cpo.src='/cdn-cgi/challenge-platform/h/b/scripts/alpha/invisible.js?ts='+ts,document.getElementsByTagName('head')[0].appendChild(_cpo);</script><script src="/cdn-cgi/challenge-platform/h/b/scripts/alpha/invisible.js?ts=1681689600"></script>

invalid session id

After running for a period of time, the following error is reported

go-chatgpt-api          | [GIN] 2023/04/02 - 07:31:52 | 200 |   28.805171ms |       127.0.0.1 | POST     "/conversation"
go-chatgpt-api          | ERRO[1167] invalid session id: invalid session id
go-chatgpt-api          |
go-chatgpt-api          |
go-chatgpt-api          | 2023/04/02 07:32:25 [Recovery] 2023/04/02 - 07:32:25 panic recovered:
go-chatgpt-api          | interface conversion: interface {} is nil, not string
go-chatgpt-api          | /usr/local/go/src/runtime/iface.go:262 (0x40a989)
go-chatgpt-api          | /app/api/conversation/conversation.go:33 (0x7ae20f)
go-chatgpt-api          | /go/pkg/mod/github.com/gin-gonic/[email protected]/context.go:174 (0x7af95a)
go-chatgpt-api          | /app/middleware/precheck.go:44 (0x7af867)
go-chatgpt-api          | /go/pkg/mod/github.com/gin-gonic/[email protected]/context.go:174 (0x79aa81)
go-chatgpt-api          | /go/pkg/mod/github.com/gin-gonic/[email protected]/recovery.go:102 (0x79aa6c)
go-chatgpt-api          | /go/pkg/mod/github.com/gin-gonic/[email protected]/context.go:174 (0x79aa81)
go-chatgpt-api          | /go/pkg/mod/github.com/gin-gonic/[email protected]/recovery.go:102 (0x79aa6c)
go-chatgpt-api          | /go/pkg/mod/github.com/gin-gonic/[email protected]/context.go:174 (0x799ba6)
go-chatgpt-api          | /go/pkg/mod/github.com/gin-gonic/[email protected]/logger.go:240 (0x799b89)
go-chatgpt-api          | /go/pkg/mod/github.com/gin-gonic/[email protected]/context.go:174 (0x798c2a)
go-chatgpt-api          | /go/pkg/mod/github.com/gin-gonic/[email protected]/gin.go:620 (0x7988b1)
go-chatgpt-api          | /go/pkg/mod/github.com/gin-gonic/[email protected]/gin.go:576 (0x79855c)
go-chatgpt-api          | /usr/local/go/src/net/http/server.go:2936 (0x654075)
go-chatgpt-api          | /usr/local/go/src/net/http/server.go:1995 (0x650991)
go-chatgpt-api          | /usr/local/go/src/runtime/asm_amd64.s:1598 (0x466960)

It seems to be stuck in an infinite loop

INFO[0012] Checking captcha
INFO[0012] Captcha is clicked!
2023/04/07 09:11:16 Failed to handle captcha, looks like infinite loop, please remove CHATGPT_PROXY_SERVER to use API mode first until I find a way to fix it.
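
One way to avoid the endless loop would be to cap the number of captcha attempts and fall back to API mode after that. A minimal sketch; handleCaptchaOnce is a hypothetical placeholder for the project's actual click-and-check logic:

package main

import (
	"errors"
	"log"
)

var errCaptchaStillPresent = errors.New("captcha still present")

// handleCaptchaOnce is a hypothetical stand-in: click the checkbox, wait, then
// re-check whether the challenge is gone. Here it always fails for illustration.
func handleCaptchaOnce() error {
	return errCaptchaStillPresent
}

func handleCaptcha(maxAttempts int) error {
	for i := 1; i <= maxAttempts; i++ {
		if err := handleCaptchaOnce(); err == nil {
			return nil
		}
		log.Printf("captcha attempt %d/%d failed", i, maxAttempts)
	}
	return errors.New("giving up on captcha, falling back to API mode")
}

func main() {
	if err := handleCaptcha(3); err != nil {
		log.Fatal(err)
	}
}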

This configuration file is missing the port mapping

services:
  go-chatgpt-api:
    container_name: go-chatgpt-api
    image: linweiyuan/go-chatgpt-api
    environment:
      - GIN_MODE=release
      - CHATGPT_PROXY_SERVER=http://chatgpt-proxy-server:9515
      - NETWORK_PROXY_SERVER=socks5://chatgpt-proxy-server-warp:65535
    depends_on:
      - chatgpt-proxy-server
      - chatgpt-proxy-server-warp
    restart: unless-stopped

  chatgpt-proxy-server:
    container_name: chatgpt-proxy-server
    image: linweiyuan/chatgpt-proxy-server
    restart: unless-stopped

  chatgpt-proxy-server-warp:
    container_name: chatgpt-proxy-server-warp
    image: linweiyuan/chatgpt-proxy-server-warp
    restart: unless-stopped

After running for a while, the program reports an invalid memory address or nil pointer dereference

Not running in Docker.

Environment:
Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-144-generic x86_64)

The error log is below:

panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x138 pc=0x7af54b]

goroutine 10 [running]:
github.com/linweiyuan/go-chatgpt-api/webdriver.Refresh.func1()
        /home/go-chatgpt-api/webdriver/refresh.go:13 +0x2b
created by github.com/linweiyuan/go-chatgpt-api/webdriver.Refresh
        /home/go-chatgpt-api/webdriver/refresh.go:12 +0x6a
exit status 2
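
Judging from the stack trace, the panic comes from the goroutine started by webdriver.Refresh (refresh.go:13). A defensive sketch, assuming a package-level selenium.WebDriver that may still be nil when the ticker fires; this is not the project's actual code:

package webdriver

import (
	"log"
	"time"

	"github.com/tebeka/selenium"
)

// WebDriver is assumed here to be a package-level driver that may be nil
// if the browser session failed to start.
var WebDriver selenium.WebDriver

func Refresh() {
	go func() {
		for range time.Tick(time.Minute) {
			if WebDriver == nil {
				// Skip instead of dereferencing a nil driver and panicking.
				log.Println("webdriver not ready, skipping refresh")
				continue
			}
			if err := WebDriver.Refresh(); err != nil {
				log.Printf("refresh failed: %v", err)
			}
		}
	}()
}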

ERRO[33236] invalid character 'b' after top-level value

INFO[33236] 8bb8\u591a\u56e0\u7d20\u7684\u5f71\u54cd\uff0c\u5305\u62ec\u5bb6\u5ead\u3001\u6559\u80b2\u3001\u793e\u4f1a\u73af\u5883\u7b49\u3002\u547d\u8fd0\u5e76\u975e\u7531\u5355\u4e00\u56e0\u7d20\u51b3\u5b9a\uff0c\u56e0\u6b64\u4e0d\u80fd\u7b80\u5355\u5730\u5c06\u547d\u8fd0\u591a\u574e\u5777\u4e0e\u592a\u521a\u7684\u6027\u683c\u8054\u7cfb\u8d77\u6765\u3002\n\n\u5728\u73b0\u5b9e\u751f\u6d3b\u4e2d\uff0c\u5b66\u4f1a\u8c03\u6574\u81ea\u5df1\u7684\u6027\u683c\u3001\u884c\u4e3a\u548c\u6c9f\u901a\u65b9\u5f0f\uff0c\u4ee5\u9002\u5e94\u4e0d\u540c\u7684\u60c5\u5883\uff0c\u662f\u5f88\u91cd\u8981\u7684\u3002\u8fd9\u6837\u53ef\u4ee5\u5e2e\u52a9\u6211\u4eec\u5728\u9762\u5bf9\u56f0\u96be\u65f6\u66f4\u52a0\u575a\u5f3a\uff0c\u540c\u65f6\u5728\u5904\u7406\u4eba\u9645\u5173\u7cfb\u65f6\u66f4\u52a0\u5706\u6ed1\u3002\u8fd9\u6837\uff0c\u6211\u4eec\u624d\u80fd\u66f4\u597d\u5730\u5e94"]}, "end_turn": null, "weight": 1.0, "metadata": {"message_type": "next", "model_slug": "gpt-4"}, "recipient": "all"}, "conversation_id": "5ce479c0-3313-4238-a974-d92366ba4a7a", "error": null}
ERRO[33236] invalid character 'b' after top-level value
This slows down the replies.

{"errorMessage":"Missing accessToken."}

After setting everything up, visiting http://ip:8080 directly returns:

{"errorMessage":"Missing accessToken."}

Is there anything missing from the README.md? I deployed exactly following the first docker-compose configuration. Any help appreciated.

Thanks a lot (ฅ´ω`ฅ)
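
For reference, this usually just means the request reached the API without a ChatGPT access token. A minimal Go sketch of calling the API with a token attached; passing it via the Authorization header is an assumption based on the log messages, not something confirmed in this thread:

package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	req, err := http.NewRequest(http.MethodGet, "http://ip:8080/conversations?offset=0&limit=1", nil)
	if err != nil {
		panic(err)
	}
	// Assumption: the API reads the ChatGPT access token from the Authorization header.
	req.Header.Set("Authorization", "eyJhbGciOiJSUzI1NiIsInR5cCI6Ikpxxxxx")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status, string(body))
}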
