lanqian528 / chat2api
A service that can convert ChatGPT on the web to OpenAI API format.
License: MIT License
Does it support file upload for all GPTs?
Visiting /tokens returns a 404.
Known issue: OpenAI has enabled TLS fingerprint detection, which httpx cannot pass.
An update will follow in a while; if you have a working fix, PRs are welcome.
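For reference, a minimal sketch of the usual workaround, assuming curl_cffi (which this project already depends on) is available; the impersonation target "chrome120" is one of curl_cffi's profiles and is illustrative here:

```python
try:
    from curl_cffi import requests as cffi_requests  # pip install curl_cffi
except ImportError:
    cffi_requests = None


def make_browser_session():
    """Build a session whose TLS ClientHello matches a real Chrome's.

    httpx uses Python's ssl module, whose TLS (JA3) fingerprint is easy
    to detect and block; curl_cffi's `impersonate` option replays a real
    browser's handshake instead.
    """
    if cffi_requests is None:
        raise RuntimeError("curl_cffi is not installed")
    return cffi_requests.Session(impersonate="chrome120")


# usage (network required):
# resp = make_browser_session().get("https://chatgpt.com/api/auth/session")
```

This only addresses the TLS layer; Cloudflare may still challenge on other signals.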
wscat --connect wss://chatgpt-async-webps-prod-southcentralus-5.chatgpt.com/client/hubs/conversations?access_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhdWQiOiJodHRwczovL2NoYXRncHQtYXN5bmMtd2VicHMtcHJvZC1zb3V0aGNlbnRyYWx1cy01LndlYnB1YnN1Yi5henVyZS5jb20vY2xpZW50L2h1YnMvY29udmVyc2F0aW9ucyIsImlhdCI6MTcxNDE4NDY2NSwiZXhwIjoxNzE0MTg4MjY1LCJzdWIiOiJ1c2VyLVNFR2JVZE53eW93VjRKS2tKSnR3MHUycSIsInJvbGUiOlsid2VicHVic3ViLmpvaW5MZWF2ZUdyb3VwLnVzZXItU0VHYlVkTnd5b3dWNEpLa0pKdHcwdTJxIl0sIndlYnB1YnN1Yi5ncm91cCI6WyJ1c2VyLVNFR2JVZE53eW93VjRKS2tKSnR3MHUycSJdfQ.lbHLjqxv6b2zGqGEHlX2vhtyz2QeSL_2ra20pVL9g3A
error: Unexpected server response: 403
Using curl_cffi's WebSocket support does not solve it either:
curl_cffi.curl.CurlError: Failed to perform, curl: (22) Refused WebSockets upgrade: 403. See https://curl.se/libcurl/c/libcurl-errors.html first for more details.
It looks like further investigation is needed into what causes the 403.
CHATGPT_BASE_URL: ['https://chatgpt.com']
Has anyone tried using Cloudflare or Vercel to forward CHATGPT_BASE_URL, so the service can be used without a VPN?
Unknown request URL: POST /backend-api/sentinel/chat-requirements. Please check the URL for typos, or see the docs at https://platform.openai.com/docs/api-reference/.
The log is:
2024-05-28 19:37:44,984 | INFO | arkose_token: {'msg': 'success', 'variant': None, 'solved': True, 'token': '22817d3a40e5c1316.3582779602|r=us-west-2|meta=3|metabgclr=transparent|metaiconclr=%23757575|guitextcolor=%23000000|pk=35536E1E-65B4-4D96-9D97-6ADB7EFF8147|at=40|sup=1|rid=87|ag=101|cdn_url=https%3A%2F%2Ftcr9i.chat.openai.com%2Fcdn%2Ffc|lurl=https%3A%2F%2Faudio-us-west-2.arkoselabs.com|surl=https%3A%2F%2Ftcr9i.chat.openai.com|smurl=https%3A%2F%2Ftcr9i.chat.openai.com%2Fcdn%2Ffc%2Fassets%2Fstyle-manager', 'waves': 0, 'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/113.0.0.0 Safari/537.36 Edg/113.0.0.0', 'proxy': None}
2024-05-28 19:37:44,984 | INFO | Found a file_url from messages: https://www.tomchat.shop/uploads/1716895639806.jpg
2024-05-28 19:37:46,797 | INFO | file_id: file-z2NA4yhMh4ZDmLl1GrvVzb3Y, upload_url: https://files.oaiusercontent.com/file-z2NA4yhMh4ZDmLl1GrvVzb3Y?se=2024-05-28T11%3A42%3A46Z&sp=cw&sv=2023-11-03&sr=b&sig=yHdpz1/fwuggnkKLE2iXEfPx/02utzFUsCZpWcqZ588%3D
2024-05-28 19:37:50,178 | INFO | File_meta: {'file_id': 'file-z2NA4yhMh4ZDmLl1GrvVzb3Y', 'file_name': 'd2a090c8-dd39-4dbc-b764-c19d1d790ebf.jpg', 'size_bytes': 4850018, 'mime_type': 'image/jpeg', 'width': 4032, 'height': 3024}
2024-05-28 19:37:50,180 | INFO | Model mapping: gpt-4o -> gpt-4o
2024-05-28 19:37:51,462 | INFO | 5.34.216.89:62748: POST /v1/chat/completions HTTP/1.1 400 Bad Request
root@RainYun-zGEJmJFA:~#
===============
UPLOAD_BY_URL=true has already been added in the docker compose file.
When no URL is involved, replies work fine.
2024-05-24 02:17:18,226 | INFO | ------------------------------------------------------------
2024-05-24 02:17:18,227 | INFO | Chat2Api v1.1.7 | https://github.com/lanqian528/chat2api
2024-05-24 02:17:18,227 | INFO | ------------------------------------------------------------
2024-05-24 02:17:18,228 | INFO | Environment variables:
2024-05-24 02:17:18,228 | INFO | API_PREFIX: None
2024-05-24 02:17:18,228 | INFO | AUTHORIZATION: []
2024-05-24 02:17:18,229 | INFO | CHATGPT_BASE_URL: ['https://chatgpt.com']
2024-05-24 02:17:18,229 | INFO | ARKOSE_TOKEN_URL: ['http://arkose:60233/token']
2024-05-24 02:17:18,229 | INFO | PROXY_URL: []
2024-05-24 02:17:18,230 | INFO | HISTORY_DISABLED: False
2024-05-24 02:17:18,230 | INFO | POW_DIFFICULTY: 000032
2024-05-24 02:17:18,230 | INFO | RETRY_TIMES: 3
2024-05-24 02:17:18,231 | INFO | ENABLE_GATEWAY: True
2024-05-24 02:17:18,231 | INFO | CONVERSATION_ONLY: False
2024-05-24 02:17:18,231 | INFO | ENABLE_LIMIT: True
2024-05-24 02:17:18,231 | INFO | UPLOAD_BY_URL: False
2024-05-24 02:17:18,232 | INFO | ------------------------------------------------------------
2024-05-24 02:17:18,254 | INFO | Token list count: 2
2024-05-24 02:17:18,277 | INFO | Started server process [1]
2024-05-24 02:17:18,278 | INFO | Waiting for application startup.
2024-05-24 02:17:18,284 | INFO | Adding job tentatively -- it will be properly scheduled when the scheduler starts
2024-05-24 02:17:18,285 | INFO | Added job "clean_dict" to job store "default"
2024-05-24 02:17:18,285 | INFO | Scheduler started
2024-05-24 02:17:18,286 | INFO | Application startup complete.
2024-05-24 02:17:18,290 | INFO | Uvicorn running on http://0.0.0.0:5005 (Press CTRL+C to quit)
2024-05-24 02:17:31,372 | INFO | 172.19.1.9:54722: GET / HTTP/1.1 500 Internal Server Error
2024-05-24 02:17:32,035 | INFO | 172.19.1.9:54724: GET /api/auth/session HTTP/1.1 500 Internal Server Error
2024-05-24 02:17:32,431 | INFO | 172.19.1.9:54726: GET /backend-anon/accounts/check/v4-2023-04-27?timezone_offset_min=-480 HTTP/1.1 401 Unauthorized
2024-05-24 02:17:32,875 | INFO | 172.19.1.9:54730: GET /ces/v1/projects/oai/settings HTTP/1.1 200 OK
2024-05-24 02:17:33,226 | INFO | 172.19.1.9:54732: POST /ces/v1/p HTTP/1.1 200 OK
2024-05-24 02:17:33,266 | INFO | 172.19.1.9:54734: POST /ces/v1/t HTTP/1.1 200 OK
Deployed with docker-compose:
version: '3'
services:
  chat2api:
    image: lanqian528/chat2api:latest
    container_name: chat2api
    restart: unless-stopped
    networks:    # join the user-defined network (network_mode cannot name a compose network)
      - chat2api
    ports:
      - '60234:5005'
    volumes:
      - chat2api-data:/app/data  # mount the data that needs to persist
    environment:
      - ARKOSE_TOKEN_URL=http://arkose:60233/token  # note: inside this network the solver listens on container port 5006, not the host port
      - HISTORY_DISABLED=false
      - TZ=Asia/Shanghai
  arkose:
    image: lanqian528/funcaptcha_solver:latest
    container_name: funcaptcha_solver
    restart: unless-stopped
    networks:
      - chat2api
    ports:
      - '60233:5006'
# custom network
networks:
  chat2api:
    name: chat2api
    driver: bridge
    ipam:
      config:
        - subnet: 172.19.1.8/29
          gateway: 172.19.1.9
# custom volume
volumes:
  chat2api-data:
    name: chat2api-data
The host IP is in California; the error is the same whether TZ is set or not. Access goes through an Openresty reverse proxy, with a CDN in between.
OpenAI is using wss now; please kindly add support for wss as well. Thank you very much!
Error Type:
AttributeError
Error Message:
module 'orjson' has no attribute 'JSONEncoder'
Logs:
time="2024-05-25T06:35:15Z" level=info msg="serving logs listener on sandbox.localdomain:1234" agent=logsApiAgent
TELEMETRY Name: telemetry-extension State: Already subscribed Types: [Function]
[ERROR] AttributeError: module 'orjson' has no attribute 'JSONEncoder'
Traceback (most recent call last):
File "/var/lang/lib/python3.9/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 850, in exec_module
File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
File "/var/task/_entry.py", line 1, in <module>
from detalib.handler import handle
File "/opt/python/detalib/handler.py", line 3, in <module>
from . import handlers
File "/opt/python/detalib/handlers.py", line 1, in <module>
from .event import parse_event
File "/opt/python/detalib/event.py", line 1, in <module>
from .helpers import json
File "/opt/python/detalib/helpers.py", line 16, in <module>
class CustomJSONEncoder(json.JSONEncoder):
Regarding /tokens management: if a token becomes invalid, will it automatically be removed from the rotation?
Or is there a mechanism to retry with another token when one fails?
Thanks for this awesome project 👍
After clicking deploy to Zeabur, the service cannot be accessed.
The error message is as follows:
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 265, in __call__
await wrap(partial(self.listen_for_disconnect, receive))
File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 261, in wrap
await func()
File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 238, in listen_for_disconnect
message = await receive()
^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 535, in receive
await self.message_event.wait()
File "/usr/local/lib/python3.11/asyncio/locks.py", line 213, in wait
await fut
asyncio.exceptions.CancelledError: Cancelled by cancel scope 7fbeaeb6a890
During handling of the above exception, another exception occurred:
+ Exception Group Traceback (most recent call last):
| File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 407, in run_asgi
| result = await app( # type: ignore[func-returns-value]
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
| return await self.app(scope, receive, send)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
| await super().__call__(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
| await self.middleware_stack(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
| raise exc
| File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
| await self.app(scope, receive, _send)
| File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
| await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
| raise exc
| File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
| await app(scope, receive, sender)
| File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
| await self.middleware_stack(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
| await route.handle(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 297, in handle
| await self.app(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 77, in app
| await wrap_app_handling_exceptions(app, request)(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
| raise exc
| File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
| await app(scope, receive, sender)
| File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 75, in app
| await response(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 258, in __call__
| async with anyio.create_task_group() as task_group:
| File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 678, in __aexit__
| raise BaseExceptionGroup(
| ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
+-+---------------- 1 ----------------
| Traceback (most recent call last):
| File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 261, in wrap
| await func()
| File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 250, in stream_response
| async for chunk in self.body_iterator:
| File "/app/chatgpt/ChatService.py", line 224, in send_conversation_for_stream
| async with self.session.stream("POST", url, headers=self.headers, json=self.chat_request) as r:
| File "/usr/local/lib/python3.11/contextlib.py", line 210, in __aenter__
| return await anext(self.gen)
| ^^^^^^^^^^^^^^^^^^^^^
| File "/usr/local/lib/python3.11/site-packages/curl_cffi/requests/session.py", line 980, in stream
| rsp = await self.request(*args, **kwargs, stream=True)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/usr/local/lib/python3.11/site-packages/curl_cffi/requests/session.py", line 1027, in request
| self._check_session_closed()
| File "/usr/local/lib/python3.11/site-packages/curl_cffi/requests/session.py", line 579, in _check_session_closed
| raise SessionClosed("Session is closed, cannot send request.")
| curl_cffi.requests.errors.SessionClosed: Session is closed, cannot send request.
+------------------------------------
INFO: 172.17.0.1:41566 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 407, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
await self.middleware_stack(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
raise exc
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
await self.app(scope, receive, _send)
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
await self.middleware_stack(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
await route.handle(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 297, in handle
await self.app(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 77, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 72, in app
response = await func(request)
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 278, in app
raw_response = await run_endpoint_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
return await dependant.call(**values)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/chat2api.py", line 30, in send_conversation
return JSONResponse(await chat_service.send_conversation(data), media_type="application/json")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/chatgpt/ChatService.py", line 247, in send_conversation
resp = (await r.atext()).split("\n")
^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/curl_cffi/requests/models.py", line 204, in atext
return self._decode(await self.acontent())
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/curl_cffi/requests/models.py", line 209, in acontent
async for chunk in self.aiter_content():
File "/usr/local/lib/python3.11/site-packages/curl_cffi/requests/models.py", line 191, in aiter_content
raise chunk
curl_cffi.requests.errors.RequestsError: Failed to perform, curl: (18) . See https://curl.se/libcurl/c/libcurl-errors.html first for more details.
How can this be solved?
I'd like to know how to pass files through the API. Currently I only know how to pass images; if I follow the image logic, ChatGPT doesn't seem to recognize files.
# gateway address
CHATPROXY: "https://demo.xyhelper.cn"
# gateway authkey
AUTHKEY: "xyhelper"
Something in this form.
Is the AUTHORIZATION token the access token? I get 401 whether I fill it in or not. How should this be configured? I've confirmed the proxy is fine.
Hi, I'm running into a 400 error. How should PROXY_URL=your_first_proxy, your_second_proxy be set? What should your_first_proxy be? Thanks!
Traceback (most recent call last):
File "/app/app.py", line 4, in <module>
from chatgpt.ChatService import ChatService
File "/app/chatgpt/ChatService.py", line 13, in <module>
from utils.config import history_disabled, free35_base_url_list, proxy_url_list
File "/app/utils/config.py", line 9, in <module>
authorization = os.getenv('AUTHORIZATION').replace(' ', '')
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'replace'
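The crash comes from `os.getenv('AUTHORIZATION')` returning None when the variable is unset, and None has no `.replace()`. A minimal sketch of a defensive parse (the helper name is mine, not the project's):

```python
import os


def parse_authorization(env) -> list:
    """Split a comma-separated AUTHORIZATION value, tolerating its absence.

    os.getenv / env.get return None for unset variables, which is exactly
    the AttributeError in the traceback; defaulting to '' avoids it.
    """
    raw = env.get('AUTHORIZATION') or ''
    return [t for t in raw.replace(' ', '').split(',') if t]


authorization_list = parse_authorization(os.environ)
```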
For example, use Playwright to drive Chrome when requesting OpenAI, to get past the Cloudflare challenge.
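A hedged sketch of that suggestion (assuming Playwright is installed via pip install playwright && playwright install chromium; the URL is whatever page needs fetching):

```python
try:
    from playwright.sync_api import sync_playwright
except ImportError:
    sync_playwright = None


def fetch_via_real_browser(url: str) -> str:
    """Fetch a page with a real headless Chromium.

    The TLS handshake and JS environment come from an actual browser,
    so they look genuine to Cloudflare's checks.
    """
    if sync_playwright is None:
        raise RuntimeError("playwright is not installed")
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, wait_until="domcontentloaded")
        html = page.content()
        browser.close()
        return html
```

A real browser per request is heavy; a pooled, long-lived browser context would be the practical variant.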
It ran fine yesterday but fails today.
Re-pulling the image doesn't help either:
$ docker run -d \
--name chat2api \
-p 5005:5005 \
lanqian528/chat2api:latest
$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
958d58033063 lanqian528/chat2api:latest "python app.py" 23 seconds ago Up 22 seconds 0.0.0.0:5005->5005/tcp chat2api
$ curl --location 'http://127.0.0.1:5005/v1/chat/completions' \
--header 'Content-Type: application/json' \
--data '{
"model": "gpt-3.5-turbo",
"messages": [{"role": "user", "content": "Say this is a test!"}],
"stream": true
}'
{"detail":"a bytes-like object is required, not 'str'"}
Thanks to the author for developing this. I ran into the following problem:
After deploying with docker compose, it works fine via the LAN IP, but when accessing it through a domain via a Cloudflare tunnel it shows "Unable to load site. Please try again later. If you are using a VPN, try turning it off. Check the status page for information on outages."
Does this project currently support file upload and image recognition with the gpt-4o model?
After filling in AUTHORIZATION I keep getting {"detail":"Unauthorized"},
whether I use the 45-character refresh_token
or the accessToken starting with eyJhbGciOi.
Both return {"detail":"Unauthorized"}.
In the logs I see Model mapping: gpt-3.5-turbo -> text-davinci-002-render-sha. Is this normal? How can I use gpt-3.5-turbo directly? I'm a non-Plus user, connecting to chat2api through one-api with an access token.
Error 401,
{"detail":"Not authenticated"}
Could this be caused by using a mainland-China IP?
First, thanks for the author's generous work. A suggestion:
Please handle failing refresh tokens. From the logs, once an RT becomes invalid (wrong password or banned account), every rotation onto that RT still attempts an AT refresh; with retries configured, this causes frequent failed RT-to-AT refresh requests.
Suggestion: when an RT fails, remove it from token.txt and from the cache to prevent the above. To guard against accidental deletion, removed tokens could be backed up to a separate txt file.
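A sketch of the suggested bookkeeping (file names follow the proposal; the project's actual token storage may differ):

```python
from pathlib import Path


def retire_refresh_token(rt: str, token_file: Path, backup_file: Path) -> None:
    """Drop a dead refresh token from the rotation and back it up.

    Intended to be called when refreshing an access token from `rt`
    fails permanently (wrong password, banned account), so the scheduler
    stops retrying it on every rotation.
    """
    tokens = [t for t in token_file.read_text().splitlines() if t.strip()]
    if rt not in tokens:
        return
    tokens = [t for t in tokens if t != rt]
    token_file.write_text("".join(t + "\n" for t in tokens))
    # backup guards against accidental removal, as the suggestion proposes
    with backup_file.open("a") as f:
        f.write(rt + "\n")
```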
Since we might copy an invalid access token from Firefox (the string gets truncated and a Unicode '…' character is inserted),
adding encoding='ascii' to load_dotenv() adds a check for a valid access token and avoids a weird issue to debug.
Thanks for your project.
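python-dotenv's load_dotenv() does accept an encoding parameter, so load_dotenv(encoding='ascii') would make a truncated token fail fast with a UnicodeDecodeError. A sketch of an explicit check in the same spirit (the heuristics are mine):

```python
def looks_like_access_token(token: str) -> bool:
    """Reject tokens mangled by copy/paste from browser devtools.

    A JWT access token is pure ASCII: three base64url segments joined
    by dots. Firefox truncates long displayed strings with a Unicode
    ellipsis '…', which this check catches before the token is used.
    """
    return token.isascii() and token.count(".") == 2 and len(token) > 20
```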
funcaptcha_solver | http://127.0.0.1:5006/token
funcaptcha_solver | * Serving Flask app 'app'
funcaptcha_solver | * Debug mode: off
funcaptcha_solver | 2024-04-21 21:28:00,322 | INFO | WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
funcaptcha_solver | * Running on all addresses (0.0.0.0)
funcaptcha_solver | * Running on http://127.0.0.1:5006
funcaptcha_solver | * Running on http://172.18.0.3:5006
funcaptcha_solver | 2024-04-21 21:28:00,322 | INFO | Press CTRL+C to quit
chat2api | 2024-04-21 21:28:00,506 | INFO | Environment variables (no AUTHORIZATION):
chat2api | 2024-04-21 21:28:00,507 | INFO | CHATGPT_BASE_URL: ['https://chat.openai.com']
chat2api | 2024-04-21 21:28:00,507 | INFO | ARKOSE_TOKEN_URL: ['http://arkose:5006/token']
chat2api | 2024-04-21 21:28:00,507 | INFO | PROXY_URL: []
chat2api | 2024-04-21 21:28:00,507 | INFO | HISTORY_DISABLED: True
chat2api | 2024-04-21 21:28:00,507 | INFO | RETRY_TIMES: 3
chat2api | 2024-04-21 21:28:00,557 | INFO | Started server process [1]
chat2api | 2024-04-21 21:28:00,558 | INFO | Waiting for application startup.
chat2api | 2024-04-21 21:28:00,558 | INFO | Application startup complete.
chat2api | 2024-04-21 21:28:00,558 | INFO | Uvicorn running on http://0.0.0.0:5005 (Press CTRL+C to quit)
chat2api | 2024-04-21 21:28:08,541 | INFO | 172.18.0.1:55804: POST /v1/chat/completions HTTP/1.1 500 Internal Server Error
chat2api | 2024-04-21 21:28:08,543 | ERROR | Exception in ASGI application
chat2api | Traceback (most recent call last):
chat2api | File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 779, in urlopen
chat2api | self._prepare_proxy(conn)
chat2api | File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 1048, in _prepare_proxy
chat2api | conn.connect()
chat2api | File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 633, in connect
chat2api | self._tunnel() # type: ignore[attr-defined]
chat2api | ^^^^^^^^^^^^^^
chat2api | File "/usr/local/lib/python3.11/http/client.py", line 943, in _tunnel
chat2api | raise OSError(f"Tunnel connection failed: {code} {message.strip()}")
chat2api | OSError: Tunnel connection failed: 403 Access denied
chat2api |
chat2api | The above exception was the direct cause of the following exception:
chat2api |
chat2api | urllib3.exceptions.ProxyError: ('Unable to connect to proxy', OSError('Tunnel connection failed: 403 Access denied'))
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 265, in __call__
await wrap(partial(self.listen_for_disconnect, receive))
File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 261, in wrap
await func()
File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 238, in listen_for_disconnect
message = await receive()
^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 535, in receive
await self.message_event.wait()
File "/usr/local/lib/python3.11/asyncio/locks.py", line 213, in wait
await fut
asyncio.exceptions.CancelledError: Cancelled by cancel scope 7fc71594f010
During handling of the above exception, another exception occurred:
+ Exception Group Traceback (most recent call last):
| File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 407, in run_asgi
| result = await app( # type: ignore[func-returns-value]
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
| return await self.app(scope, receive, send)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
| await super().__call__(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
| await self.middleware_stack(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
| raise exc
| File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
| await self.app(scope, receive, _send)
| File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
| await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
| raise exc
| File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
| await app(scope, receive, sender)
| File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
| await self.middleware_stack(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
| await route.handle(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 297, in handle
| await self.app(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 77, in app
| await wrap_app_handling_exceptions(app, request)(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
| raise exc
| File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
| await app(scope, receive, sender)
| File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 75, in app
| await response(scope, receive, send)
| File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 258, in __call__
| async with anyio.create_task_group() as task_group:
| File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 678, in __aexit__
| raise BaseExceptionGroup(
| ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
+-+---------------- 1 ----------------
| Traceback (most recent call last):
| File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 261, in wrap
| await func()
| File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 250, in stream_response
| async for chunk in self.body_iterator:
| File "/app/chatgpt/ChatService.py", line 228, in send_conversation_for_stream
| await self.session.post(url, headers=self.headers, json=self.chat_request, stream=True)) as r:
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/usr/local/lib/python3.11/site-packages/curl_cffi/requests/session.py", line 1027, in request
| self._check_session_closed()
| File "/usr/local/lib/python3.11/site-packages/curl_cffi/requests/session.py", line 579, in _check_session_closed
| raise SessionClosed("Session is closed, cannot send request.")
| curl_cffi.requests.errors.SessionClosed: Session is closed, cannot send request.
+------------------------------------
After deploying locally, opening the web page gives an error:
"Can you help me design a game concept for teaching basic programming skills? First ask me which programming language I'd like to use."
ChatGPT:
Unusual activity has been detected from your device. Try again later. (8852dd628a982ee7-LAX)
===== Application Startup at 2024-05-21 09:23:40 =====
2024-05-21 09:23:47,369 | INFO | ------------------------------------------------------------
2024-05-21 09:23:47,370 | INFO | Chat2Api v1.1.2 | https://github.com/lanqian528/chat2api
2024-05-21 09:23:47,370 | INFO | ------------------------------------------------------------
2024-05-21 09:23:47,370 | INFO | Environment variables:
2024-05-21 09:23:47,370 | INFO | API_PREFIX: yyds
2024-05-21 09:23:47,370 | INFO | AUTHORIZATION: []
2024-05-21 09:23:47,370 | INFO | CHATGPT_BASE_URL: ['https://chatgpt.com']
2024-05-21 09:23:47,370 | INFO | ARKOSE_TOKEN_URL: []
2024-05-21 09:23:47,370 | INFO | PROXY_URL: []
2024-05-21 09:23:47,370 | INFO | HISTORY_DISABLED: True
2024-05-21 09:23:47,370 | INFO | POW_DIFFICULTY: 000032
2024-05-21 09:23:47,370 | INFO | RETRY_TIMES: 3
2024-05-21 09:23:47,370 | INFO | ENABLE_GATEWAY: True
2024-05-21 09:23:47,370 | INFO | CONVERSATION_ONLY: False
2024-05-21 09:23:47,370 | INFO | ------------------------------------------------------------
Traceback (most recent call last):
File "/app/app.py", line 9, in <module>
uvicorn.run("chat2api:app", host="0.0.0.0", port=5005)
File "/usr/local/lib/python3.11/site-packages/uvicorn/main.py", line 575, in run
server.run()
File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 65, in run
return asyncio.run(self.serve(sockets=sockets))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/asyncio/runners.py", line 190, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 69, in serve
await self._serve(sockets)
File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 76, in _serve
config.load()
File "/usr/local/lib/python3.11/site-packages/uvicorn/config.py", line 433, in load
self.loaded_app = import_from_string(self.app)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/uvicorn/importer.py", line 19, in import_from_string
module = importlib.import_module(module_str)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 940, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/app/chat2api.py", line 14, in <module>
from utils.authorization import verify_token, token_list
File "/app/utils/authorization.py", line 6, in <module>
from chatgpt.refreshToken import rt2ac
File "/app/chatgpt/refreshToken.py", line 18, in <module>
os.makedirs(DATA_FOLDER)
File "<frozen os>", line 225, in makedirs
PermissionError: [Errno 13] Permission denied: 'data'
…until it fills more than 20 GB of memory and the machine hangs.
HISTORY_DISABLED is already set to false.
How do I obtain an ArkoseToken?
arkose:
  image: lanqian528/funcaptcha_solver:latest
  container_name: funcaptcha_solver
  restart: unless-stopped
  ports:
    - '5006:5006'
https://github.com/Ink-Osier/GenerateArkose
Everyone has their own tricks. Which project works best at the moment?
Even in the best case of 8 tokens, response time is over 8 seconds;
with a proxy pool enabled, over 14 seconds.
Has anyone else run into this? How did you solve it? Any tips?
Many thanks for this project. When I send an image to gpt-4o in NextChat, the reply says it cannot view the image. Does chat2api not support images?
When using the refresh cookie as the token, it returns an error:
{
"detail": "Could not parse your authentication token. Please try signing in again."
}
Can the refresh cookie currently be used directly as the token? Thanks!
Verifying that you are human. This may take a few seconds.
Verification is taking longer than expected. If the problem persists, check your Internet connection and refresh the page.
Is there a way to put account credentials in a txt file in the project, so tokens are fetched automatically, recorded into token.txt, and refreshed periodically?
US server, error {"detail":"cf-please-wait"}. Any ideas?
There is a problem with uploading non-image files such as PDFs:
the retrieval status needs to be checked; without the check, the file cannot be read.
{
  "id": "file-CnBhx5ULQN6514xDgGeH66u9",
  "name": "dmr7brxv.pdf",
  "creation_time": "2024-05-01 15:38:39.956360+00:00",
  "state": "ready",
  "ready_time": "2024-05-01T15:38:53.435776",
  "size": 1457326,
  "metadata": {
    "retrieval": {
      "status": "success",  # this must be 'success' or 'skipped'
      "file_size_tokens": 34812
    }
  },
  "use_case": "my_files",
  "retrieval_index_status": "success",
  "file_size_tokens": 34812,
  "variants": null
}
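A sketch of the check described above. `get_file_meta` is a hypothetical callable standing in for however the file-metadata endpoint is wrapped; the accepted statuses come straight from the comment in the JSON:

```python
import time

READY_STATUSES = ("success", "skipped")  # per the metadata above


def wait_until_retrievable(get_file_meta, file_id: str,
                           timeout: float = 60.0,
                           interval: float = 1.0) -> dict:
    """Poll file metadata until retrieval indexing finishes.

    For non-image files such as PDFs, sending the conversation before
    metadata.retrieval.status reaches 'success' (or 'skipped') means
    the model cannot read the file.
    """
    deadline = time.monotonic() + timeout
    while True:
        meta = get_file_meta(file_id)
        retrieval = (meta.get("metadata") or {}).get("retrieval") or {}
        if meta.get("state") == "ready" and retrieval.get("status") in READY_STATUSES:
            return meta
        if time.monotonic() >= deadline:
            raise TimeoutError(f"file {file_id} was not indexed within {timeout}s")
        time.sleep(interval)
```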
Deployed directly via git clone.
Accessing ip:5005 works fine, but ip:5005/tokens returns a 404.