
kituin / picimagesearch


整合图片识别 API,用于以图搜源 / Aggregator for Reverse Image Search API

Home Page: https://pic-image-search.kituin.fun/

License: MIT License

Languages: Python 100.00%
Topics: tracemoe, saucenao, ascii2d, iqdb, google, baidu, e-hentai, exhentai, yandex

picimagesearch's Introduction

PicImageSearch

Read in other languages: English, 中文, Русский, 日本語

✨ Aggregated Image Search Engine for Reverse Image Search ✨


📖 Documentation · 🐛 Submit an Issue

Supported Search Engines

Engine Website
ASCII2D https://ascii2d.net/
Baidu https://graph.baidu.com/
E-Hentai https://e-hentai.org/
ExHentai https://exhentai.org/
Google https://www.google.com/imghp
IQDB https://iqdb.org/
SauceNAO https://saucenao.com/
TraceMoe https://trace.moe/
Yandex https://yandex.com/images/search

Usage

For detailed information, please refer to the documentation or the example code.
For synchronous usage, import from PicImageSearch.sync (e.g. from PicImageSearch.sync import ...).
For asynchronous usage, import from PicImageSearch (e.g. from PicImageSearch import Network, ...).
Asynchronous usage is recommended.
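A minimal sketch of both styles, pieced together from the demo snippets quoted in the issues below; the exact re-exports of PicImageSearch.sync and the search(url=...) signature for Ascii2D are assumptions:

import asyncio

from PicImageSearch import Ascii2D, Network            # asynchronous API
from PicImageSearch.sync import Google as GoogleSync   # synchronous API (aliasing assumed)


# Asynchronous usage (recommended): share one Network client between engines.
async def search_async(url: str):
    async with Network(proxies=None) as client:
        ascii2d = Ascii2D(client=client)
        resp = await ascii2d.search(url=url)  # file=... is also used in the issues below
        return resp.raw                       # list of result items


# Synchronous usage: the sync module wraps the same engines.
def search_sync(url: str):
    google = GoogleSync()
    resp = google.search(url=url)
    return resp.raw


if __name__ == "__main__":
    test_url = "https://raw.githubusercontent.com/kitUIN/PicImageSearch/main/demo/images/test03.jpg"
    print(asyncio.run(search_async(test_url)))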

Installation

  • Requires Python 3.9 and above.
  • Installation command: pip install PicImageSearch
  • Or use the Tsinghua mirror: pip install PicImageSearch -i https://pypi.tuna.tsinghua.edu.cn/simple

Star History


picimagesearch's People

Contributors

chinoll, dependabot[bot], eltociear, kituin, lleans, nachtalb, nekoaria, peloxerat, renovate[bot]



picimagesearch's Issues

Google class not working

Google search doesn't work.
Your library uses https://www.google.com/searchbyimage, but that URL no longer works; you need to search via Google Lens instead.

I would also like to see Yandex image search in your library; it searches by image better than Google. Please consider this option ^^

Examples (for a URL):
Image URL: https://i1.sndcdn.com/avatars-000248291936-634zqi-t500x500.jpg
Google: https://lens.google.com/uploadbyurl?url=https://i1.sndcdn.com/avatars-000248291936-634zqi-t500x500.jpg
Yandex: https://yandex.com/images/search?rpt=imageview&url=https://i1.sndcdn.com/avatars-000248291936-634zqi-t500x500.jpg
TinEye: https://tineye.com/search?url=https://i1.sndcdn.com/avatars-000248291936-634zqi-t500x500.jpg
Bing: https://www.bing.com/images/searchbyimage?cbir=sbi&imgurl=https://i1.sndcdn.com/avatars-000248291936-634zqi-t500x500.jpg

I don't know how to do it for file uploads, but it would be very useful.
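A hypothetical helper illustrating the by-URL patterns listed above; it simply composes query strings and is not part of PicImageSearch (the examples above pass the URL unencoded, but quoting it is safer):

from urllib.parse import quote

def reverse_search_links(image_url: str) -> dict:
    # Build the by-URL reverse-search links for the engines mentioned above.
    encoded = quote(image_url, safe="")
    return {
        "google_lens": f"https://lens.google.com/uploadbyurl?url={encoded}",
        "yandex": f"https://yandex.com/images/search?rpt=imageview&url={encoded}",
        "tineye": f"https://tineye.com/search?url={encoded}",
        "bing": f"https://www.bing.com/images/searchbyimage?cbir=sbi&imgurl={encoded}",
    }

print(reverse_search_links("https://i1.sndcdn.com/avatars-000248291936-634zqi-t500x500.jpg"))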

Baidu image search error

Hi, I'm using the Baidu image-search API, but it raises AttributeError: 'BaiDuResponse' object has no attribute 'same'. Could the author take a look?

How to solve it?

 DeprecationWarning: There is no current event loop
  loop = asyncio.get_event_loop()
2024-03-27 19:03:58.576 | INFO     | __main__:show_result:48 - ['https://www.google.co.jp/search?tbs=sbi:AMhZZivZTycvn-TYWekgq4YSrsxQ1b1-OlgmBWA-9xu8nU2ILP8NNNktK_1fF-jkXjTMF_11HQAH7CvMUd-GhLglA8Gq6_1flXmyG7RIv1LcwFoVTn2tueoDMvNJs-SDAR_1rUwuAVLwiheoVTSWFEJDqnXsS9t2ppA24-E4nIHdoHqf3dlMmqFOKMBcMVzP70DpxDFMb9Z6AJ9WUHP6oDqsZLWCwxCl2ja2HbL5-I3IU8N_1mQAlrQ5UZdbG91KLkSmxe7y3vFDY0FIFkK3c2icuR50OycPx3hvjoomMxwgPOAkaq0Za_1C478YMmC41JBjEKG1nerBCpNJ5u']
2024-03-27 19:03:58.576 | INFO     | __main__:show_result:49 - 1
2024-03-27 19:03:58.576 | INFO     | __main__:show_result:50 - https://www.google.co.jp/search?tbs=sbi:AMhZZivZTycvn-TYWekgq4YSrsxQ1b1-OlgmBWA-9xu8nU2ILP8NNNktK_1fF-jkXjTMF_11HQAH7CvMUd-GhLglA8Gq6_1flXmyG7RIv1LcwFoVTn2tueoDMvNJs-SDAR_1rUwuAVLwiheoVTSWFEJDqnXsS9t2ppA24-E4nIHdoHqf3dlMmqFOKMBcMVzP70DpxDFMb9Z6AJ9WUHP6oDqsZLWCwxCl2ja2HbL5-I3IU8N_1mQAlrQ5UZdbG91KLkSmxe7y3vFDY0FIFkK3c2icuR50OycPx3hvjoomMxwgPOAkaq0Za_1C478YMmC41JBjEKG1nerBCpNJ5u
2024-03-27 19:03:58.576 | INFO     | __main__:show_result:51 - 1
2024-03-27 19:03:58.578 | ERROR    | asyncio.events:_run:80 - An error has been caught in function '_run', process 'MainProcess' (4620), thread 'MainThread' (21300):
Traceback (most recent call last):

  File "D:\DrissionPage-dev\try.py", line 64, in <module>
    loop.run_until_complete(t1())
    │    │                  └ <function t1 at 0x000001DBB6389900>
    │    └ <function BaseEventLoop.run_until_complete at 0x000001DBB38BD240>
    └ <ProactorEventLoop running=True closed=False debug=False>

  File "F:\dev\Python\Python3.10.4\lib\asyncio\base_events.py", line 633, in run_until_complete
    self.run_forever()
    │    └ <function ProactorEventLoop.run_forever at 0x000001DBB3993640>
    └ <ProactorEventLoop running=True closed=False debug=False>

  File "F:\dev\Python\Python3.10.4\lib\asyncio\windows_events.py", line 321, in run_forever
    super().run_forever()

  File "F:\dev\Python\Python3.10.4\lib\asyncio\base_events.py", line 600, in run_forever
    self._run_once()
    │    └ <function BaseEventLoop._run_once at 0x000001DBB38BECB0>
    └ <ProactorEventLoop running=True closed=False debug=False>

  File "F:\dev\Python\Python3.10.4\lib\asyncio\base_events.py", line 1896, in _run_once
    handle._run()
    │      └ <function Handle._run at 0x000001DBB386A830>
    └ <Handle <TaskStepMethWrapper object at 0x000001DBB64380D0>()>

> File "F:\dev\Python\Python3.10.4\lib\asyncio\events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
    │    │            │    │           │    └ <member '_args' of 'Handle' objects>
    │    │            │    │           └ <Handle <TaskStepMethWrapper object at 0x000001DBB64380D0>()>
    │    │            │    └ <member '_callback' of 'Handle' objects>
    │    │            └ <Handle <TaskStepMethWrapper object at 0x000001DBB64380D0>()>
    │    └ <member '_context' of 'Handle' objects>
    └ <Handle <TaskStepMethWrapper object at 0x000001DBB64380D0>()>

  File "D:\DrissionPage-dev\try.py", line 23, in t1
    show_result(resp)
    │           └ <PicImageSearch.model.google.GoogleResponse object at 0x000001DBB3D2D870>
    └ <function show_result at 0x000001DBB6389CF0>

  File "D:\DrissionPage-dev\try.py", line 54, in show_result
    selected = next((i for i in resp.raw if i.thumbnail), resp.raw[0])
                                │    │                    │    └ []
                                │    │                    └ <PicImageSearch.model.google.GoogleResponse object at 0x000001DBB3D2D870>
                                │    └ []
                                └ <PicImageSearch.model.google.GoogleResponse object at 0x000001DBB3D2D870>

IndexError: list index out of range
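The log above shows that resp.raw is empty, so indexing resp.raw[0] raises IndexError. A defensive sketch of the reporter's show_result (assuming only the attributes visible in the traceback) that guards against an empty result list:

def show_result(resp) -> None:
    # resp.raw can be empty, e.g. when Google returns no parsable results,
    # so check before indexing resp.raw[0].
    if not resp.raw:
        print("no results")
        return
    selected = next((i for i in resp.raw if i.thumbnail), resp.raw[0])
    print(selected)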

Ascii2D demo raises an error

I used the demo as-is, and it errors at runtime:

https://ascii2d.net/search/uri
Traceback (most recent call last):
  File ".\main.py", line 45, in <module>
    loop.run_until_complete(test())
  File "C:\Users\56393\AppData\Local\Programs\Python\Python37\lib\asyncio\base_events.py", line 584, in run_until_complete
    return future.result()
  File ".\main.py", line 18, in test
    show_result(resp)
  File ".\main.py", line 31, in show_result
    print(resp.raw[1].origin)
IndexError: list index out of range

The https://ascii2d.net/search/uri address also returns a 404.

Another issue: for the option bypass = False  # whether to bypass DNS pollution, I tried setting bypass = True, but got TypeError: 'type' object is not subscriptable. I'm not sure what went wrong.

Some special feature requests

Could built-in DNS resolution (using DoH to resolve IPs) be added? In some regions, image-search sites can be affected by DNS poisoning.

SauceNAO: asynchronous upload of a local file fails

Error log:
2021-08-25 11:10:27.676 | INFO | PicImageSearch.Async.saucenao:search:114 - Unexpected type for 'content', <class 'requests_toolbelt.multipart.encoder.MultipartEncoder'>
2021-08-25 11:10:27.680 | INFO | PicImageSearch.Async.saucenao:search:114 - Unexpected type for 'content', <class 'requests_toolbelt.multipart.encoder.MultipartEncoder'>
2021-08-25 11:10:27.693 | INFO | PicImageSearch.Async.saucenao:search:114 - Unexpected type for 'content', <class 'requests_toolbelt.multipart.encoder.MultipartEncoder'>
Switching to a URL-based request works (screenshot attached).
Switching to synchronous upload of the local file also works (screenshot attached).
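A sketch of the URL-based workaround described above; the SauceNAO class name is taken from the PicImageSearch...saucenao path in the log, and the api_key parameter is an assumption based on SauceNAO's API requirements:

import asyncio
from PicImageSearch import Network, SauceNAO

async def main() -> None:
    async with Network() as client:
        saucenao = SauceNAO(client=client, api_key="YOUR_API_KEY")  # api_key: assumed parameter
        # Pass the image by URL instead of uploading the local file,
        # which is the workaround that succeeds above.
        resp = await saucenao.search(url="https://example.com/image.jpg")
        print(resp.raw)

asyncio.run(main())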

A problem with Baidu image search

Hi author, I run into this problem when using Baidu image search (screenshot attached). How can it be fixed?

httpx connection timeouts (intermittent)

It works, but it needs a few retries, and I want to find the source of the problem: the network or the code.
Does it always require proxies, cookies, or something similar?

async with Network(proxies=None, verify_ssl=True) as client:
    ascii2d = Ascii2D(client=client, bovw=False)
    resp = await ascii2d.search(file=pic)
    selected = None
    for i in resp.raw:
        if i.author_url.startswith("https://twitter"):
            selected = i
            break
    return selected

traceback here

Traceback (most recent call last):
  File "E:\GradioThing\venv\lib\site-packages\anyio\streams\tls.py", line 131, in _call_sslobject_method
    result = func(*args)
  File "D:\Python\lib\ssl.py", line 975, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLWantReadError: The operation did not complete (read) (_ssl.c:997)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "E:\GradioThing\venv\lib\site-packages\httpcore\_backends\anyio.py", line 69, in start_tls
    ssl_stream = await anyio.streams.tls.TLSStream.wrap(
  File "E:\GradioThing\venv\lib\site-packages\anyio\streams\tls.py", line 123, in wrap
    await wrapper._call_sslobject_method(ssl_object.do_handshake)
  File "E:\GradioThing\venv\lib\site-packages\anyio\streams\tls.py", line 138, in _call_sslobject_method
    data = await self.transport_stream.receive()
  File "E:\GradioThing\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 1203, in receive
    await self._protocol.read_event.wait()
  File "D:\Python\lib\asyncio\locks.py", line 214, in wait
    await fut
asyncio.exceptions.CancelledError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "E:\GradioThing\venv\lib\site-packages\httpcore\_exceptions.py", line 10, in map_exceptions
    yield
  File "E:\GradioThing\venv\lib\site-packages\httpcore\_backends\anyio.py", line 78, in start_tls
    raise exc
  File "E:\GradioThing\venv\lib\site-packages\httpcore\_backends\anyio.py", line 68, in start_tls
    with anyio.fail_after(timeout):
  File "E:\GradioThing\venv\lib\site-packages\anyio\_core\_tasks.py", line 119, in __exit__
    raise TimeoutError
TimeoutError

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "E:\GradioThing\venv\lib\site-packages\httpx\_transports\default.py", line 66, in map_httpcore_exceptions
    yield
  File "E:\GradioThing\venv\lib\site-packages\httpx\_transports\default.py", line 366, in handle_async_request
    resp = await self._pool.handle_async_request(req)
  File "E:\GradioThing\venv\lib\site-packages\httpcore\_async\connection_pool.py", line 262, in handle_async_request
    raise exc
  File "E:\GradioThing\venv\lib\site-packages\httpcore\_async\connection_pool.py", line 245, in handle_async_request
    response = await connection.handle_async_request(request)
  File "E:\GradioThing\venv\lib\site-packages\httpcore\_async\http_proxy.py", line 317, in handle_async_request
    stream = await stream.start_tls(**kwargs)
  File "E:\GradioThing\venv\lib\site-packages\httpcore\_backends\anyio.py", line 66, in start_tls
    with map_exceptions(exc_map):
  File "D:\Python\lib\contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "E:\GradioThing\venv\lib\site-packages\httpcore\_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectTimeout

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "E:\GradioThing\venv\lib\site-packages\gradio\routes.py", line 488, in run_predict
    output = await app.get_blocks().process_api(
  File "E:\GradioThing\venv\lib\site-packages\gradio\blocks.py", line 1431, in process_api
    result = await self.call_function(
  File "E:\GradioThing\venv\lib\site-packages\gradio\blocks.py", line 1101, in call_function
    prediction = await fn(*processed_input)
  File "E:\GradioThing\venv\lib\site-packages\gradio\utils.py", line 682, in async_wrapper
    response = await f(*args, **kwargs)
  File "E:\GradioThing\webui.py", line 235, in illu_getter
    resp = await ascii2d.search(file=pic)
  File "E:\GradioThing\venv\lib\site-packages\PicImageSearch\ascii2d.py", line 72, in search
    resp = await self.post(ascii2d_url, files=files)
  File "E:\GradioThing\venv\lib\site-packages\PicImageSearch\network.py", line 228, in post
    resp = await client.post(
  File "E:\GradioThing\venv\lib\site-packages\httpx\_client.py", line 1848, in post
    return await self.request(
  File "E:\GradioThing\venv\lib\site-packages\httpx\_client.py", line 1530, in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
  File "E:\GradioThing\venv\lib\site-packages\httpx\_client.py", line 1617, in send
    response = await self._send_handling_auth(
  File "E:\GradioThing\venv\lib\site-packages\httpx\_client.py", line 1645, in _send_handling_auth
    response = await self._send_handling_redirects(
  File "E:\GradioThing\venv\lib\site-packages\httpx\_client.py", line 1682, in _send_handling_redirects
    response = await self._send_single_request(request)
  File "E:\GradioThing\venv\lib\site-packages\httpx\_client.py", line 1719, in _send_single_request
    response = await transport.handle_async_request(request)
  File "E:\GradioThing\venv\lib\site-packages\httpx\_transports\default.py", line 365, in handle_async_request
    with map_httpcore_exceptions():
  File "D:\Python\lib\contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "E:\GradioThing\venv\lib\site-packages\httpx\_transports\default.py", line 83, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectTimeout
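Since a few retries usually succeed for the reporter, a simple retry wrapper around the search call could look like this (a sketch; the retry count and backoff values are arbitrary):

import asyncio
import httpx

async def search_with_retries(engine, retries: int = 3, delay: float = 2.0, **kwargs):
    # Retry an engine's search() call on httpx connect timeouts/errors.
    last_exc = None
    for attempt in range(1, retries + 1):
        try:
            return await engine.search(**kwargs)
        except (httpx.ConnectTimeout, httpx.ConnectError) as exc:
            last_exc = exc
            await asyncio.sleep(delay * attempt)  # simple linear backoff
    raise last_exc

# usage, following the snippet above:
# resp = await search_with_retries(ascii2d, file=pic)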

Could a base_url option be added for ASCII2D?

Something like:

google = GoogleSync(proxies=proxies, base_url=base_url)
resp = google.search(url=url)

Google lets you choose a mirror via a custom base_url; I hope Ascii2D could do the same.
The goal is to reverse-proxy ascii2d.net to a safe environment that I host myself, permanently avoiding Cloudflare's bot detection once and for all.

Other engines

What are your thoughts on adding new, specialized search engines like Alibaba, 123rf, and Wildberries? Would it be worthwhile, or just clutter the codebase with unnecessary features? I'm trying to gauge if it makes sense to invest the development time in this idea.

Ascii2D triggers Cloudflare

Tested with demo_ascii2d.py.

Test image: https://i.328888.xyz/2023/03/27/iUaGFc.jpeg

The error is raised at PicImageSearch/ascii2d.py, line 65, at return Ascii2DResponse(resp_text, resp_url).
The cause is that a correct resp_text could not be obtained.

Details:

resp_url: https://ascii2d.net/search/color/be319b2fa13f0d4edb33ffef1c25aaa0
(it opens fine in a browser)

resp_text:

<!DOCTYPE html>
<html lang="en-US">
<head>
    <title>Just a moment...</title>
    <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
    <meta http-equiv="X-UA-Compatible" content="IE=Edge">
    <meta name="robots" content="noindex,nofollow">
    <meta name="viewport" content="width=device-width,initial-scale=1">
    <link href="/cdn-cgi/styles/challenges.css" rel="stylesheet">
    

</head>
<body class="no-js">
    <div class="main-wrapper" role="main">
    <div class="main-content">
        <noscript>
            <div id="challenge-error-title">
                <div class="h2">
                    <span class="icon-wrapper">
                        <div class="heading-icon warning-icon"></div>
                    </span>
                    <span id="challenge-error-text">
                        Enable JavaScript and cookies to continue
                    </span>
                </div>
            </div>
        </noscript>
        <div id="trk_jschal_js" style="display:none;background-image:url('/cdn-cgi/images/trace/managed/nojs/transparent.gif?ray=7ae5e4804b7c8d10')"></div>
        <form id="challenge-form" action="/search/color/be319b2fa13f0d4edb33ffef1c25aaa0?__cf_chl_f_tk=xXGHEBrBJltfDIQ5NV_dx9sd5Bcu6s5ZGfwRnkS3rE8-1679901854-0-gaNycGzNClA" method="POST" enctype="application/x-www-form-urlencoded">
            <input type="hidden" name="md" value="0cHzY8t_.IGgep2yExIEYzN2yOKeVXkp4fatYjZIVKU-1679901854-0-AXcDG3aw_Pd3M54LrqHB00oyL3oZL0y5qJBgWC4BjV_mFTwRofN2ZK77IUIr8PqOi6lk5CNdRSAs-hA2jWq_y0UAtrShatLy7ProqRm5a-tuvz7H1pg-9jWy9RM8c8Vm0K9h7D3IftR8uhWiZrNKkvxjBRvX5y5cWklKabrmJ5F9mO6CKEmk1iy8oEhBd3-5G_vs15dH3Z5OTeMqrCKykehrvmdwsI9MjiJZ-uP_meNziK7Do88jF3aadhZjqAgiYn5x6qkjv0a-RzNBHnRWNcYqQBaHgRb8z-EABJi01Rl7maYthj_VhailL2JNt5L36aYV0TpbLPE6DKuoQhr-dc56_tn7pZ2WCOOasvb_8MkpU7VeuX1FMGq_F-WvCrmjK-zlt9ifmkhlT6MlfHKmoHVDcKozjcH0sBnre7b1hiBBad6COAxQLnctt0el4NWbZ3jgfN49omluFR3Z9yjTk6De5DjOZyoym-my9taYVlIsZcWZ3NHVNvZgMhMJPIvmE1U_H7dC4aIEXSNNPr08Epoy66TAYEw7jFhc8KXf1ueNrHFZSyaLLm61uNeZ__GguoZUdRW7C0A1oorCKckAqmvZEFNHWCaQAhDZJr5A3j-JAcCR4izL1LTHuUDNsgNmhW6aLSqymX1sb4dXbT-a6s1VLkXFrFTeXUhWRZe3yoQMLuyK2OLYr-wLS_DOn6HAenW7rI-3bhHIG-AshfUkyIDEB9UGVyJGJ-ygJmi3nyiVs7hUvNeErj5uHrOZLk_lZY5rv206WwTxVDPQnwahDAGW1hp-f41EQEJg8DCbgbMYisLnq6WSxg8Vew0xzuMsL_QKkhT9WeMr5_TZgvQjUO44xsLqBiYr-MPvkp4lod_UKoJ1d0_LQhyKSp_3xlxBQrR43KZw8xkeYZTU2VfDDWy7hR2ThgiP6XVclHGyKowQmBbQWKWYWUe6JvzIpLAdDTzUTwIjguCJv-70kM048y9C0mtcIr5fsk-j18mobkGgvwGKAh-lDq71HkSYMMOv6NHDpE0V5sc2SRkcDovroxQYW8IJf6U4ur3Fds6tEJq7RB1lYvLo9RKHsAm7SK3qkoaP5MynxPx1te-L_gCngqhjLh6t-lwg8YDB0NJuwjVg1rCyn5OPexmGh2d3F5t0Zm9YnofAL9INf3xo9MLmeN1wfGyKZy4nJSU48VJHF4gGDBUEADbT8Ys6Xmy0H0AQa_0V_xJ1FXJWmviFM8tbnFU9kRsXFY056QhEUcVWd-t3zMnbemaDKBtnp8BBJ8SqywoHn05Kr4TTR1W0fWGKg4W9_swusOtIE2NJt4HgNjPHkLodudMsf2gUXHUhNI27V5A00L1tF4XEYuemz4p7X_adj67N4woKqAeEXwpXe3Si8N_8pmJzy4jGsalxht8F2nouAgqGgNCSrknjgz89px6ppD7QJXC6m1lfLzWkijHSwm90oegIZu0zqzi83rslD5nyAUCTRHiQqNTg1j0o2sDASgc4_TzWsm7OItWtG0r0tr6oyKBYjyhCRqiEHY8zUXfk7UX3rbIIhszA1pu9pTByHUxgQtXjboQiIXelClNvQwL7xufffOIOiwyt3kjIyiUCksGHnghe-6zufjGI3YvKdRrvMFIJCUfa2JuPwaayR4a-fAjyJO56HQvyF4iL96PSvme5_bbyxb8KCqH4VOaIMWp6OjMTLCpLv647CimapuVEKwWOAf-xzw067ZVl1Su3kVXM53zTNj0-xPfG-TFRGQ6ObN2JkwesLCu1pyGa286HWewq15m1Ht-4GK-RHXeXbAb8AS4-s3ve_wu6nhjNT_0W67ahchpJ7iVo9DcctTz6u42a77v991Ws6G3RDk__WFqy0bWuOS5HwWYZh2gkL3CjeeRlqMNSeIzYfIB5O4RTubGdLmvCiustNqMTHVvD-VpEIDuiahZJmpsMqXrxOFctRxEBSiEWh9bhgrM02SUzB8sk5Lwv1QaJQB-_0Ss2GtQ30mV81Lj26pY8M_AShUb8oKLpFRkXRBNUBpivdHPXG5eAltJj1KBQH2ujpqDmzVbHfXTlqqjczqszra2YVMmYAxiejcwySrc9Xr25311zGVI52VXWzqCH6HRvXE94kFfvG5VYuYP9-Cwr2A7SN90iAFG_oC1krcgw-KppaghvaPKrStewCZHR8kQzaD0bmMOQyKAYPPNYJwiCBDcu9ZwBtF7Zkgtz5Skvd4EbJt264fYXluzioXauSkbsC55rkSP1ADWJFcrXw5gkGL2-wQipWDnKk87CZQWdD3rL8pSPIymVRuQDvQpwX87cmMQ2_Y9kSAAyHiOwrAt4EDsjcvNf06UbBQzMxOOhEAsVPnsq7jkqyR1WDGpOcXItGDiLUNoNyjtDkoaWe-JiSxGS58IF5zy6v4qMM_2OnWxQ1pRBY0lmo_0E7fEmJ4WapzkH6XMy_AvbeIJ3WLAuCbU9cAp-aLS1wknRNrFTdV5jN_8O5sH1aRr9Niee9RJHPcx9cagUpdprJQ4xvq4H-jOYITsXTdWlROQid3zG8FGF6Yuihjz1nq3YvfEtrSKq3bpb2ScgatMAEmdvVXRKJ54">
        </form>
    </div>
</div>
<script>
    (function(){
        window._cf_chl_opt={
            cvId: '2',
            cZone: 'ascii2d.net',
            cType: 'managed',
            cNounce: '52518',
            cRay: '7ae5e4804b7c8d10',
            cHash: 'b5f04b302d5034d',
            cUPMDTk: "\/search\/color\/be319b2fa13f0d4edb33ffef1c25aaa0?__cf_chl_tk=xXGHEBrBJltfDIQ5NV_dx9sd5Bcu6s5ZGfwRnkS3rE8-1679901854-0-gaNycGzNClA",
            cFPWv: 'b',
            cTTimeMs: '1000',
            cMTimeMs: '0',
            cTplV: 5,
            cTplB: 'cf',
            cK: "",
            cRq: {
                ru: 'aHR0cHM6Ly9hc2NpaTJkLm5ldC9zZWFyY2gvY29sb3IvYmUzMTliMmZhMTNmMGQ0ZWRiMzNmZmVmMWMyNWFhYTA=',
                ra: 'Y3VybC83Ljg2LjA=',
                rm: 'R0VU',
                d: 'CIG3IwizvDdruvAjuuu8UWWxIZ7tlvkAjLk6S/KSuobcWOhe4WW78TxMkol0wF4TiE6iKIzYTc+qIx2C5ioCSq/thnk8b28fAw6RLyCV/C6zdq+Jo9MuiwMie4sX7HySYBLvZJwdD0MOL0DH4Zs+NDVKQFq5CnmFHwjyEB/6J0aM3848+2VP13VlTu3QlcDv6X7MMQo/unEZlV1BYps4G0NfQQv4toyXyVQmO9C1jnJj4OTjblSvi18JmkBxxjDLdtWw5eotcTDsePrIHQxfgjmJ8n/8ckwzzJmXFCaTxxCt3fBqsinwefjA4KYJnD7qhji5XsIIGuwExwRslQ/gFB7VwOQ04zvOQg9eO9r/AnEw6EYMSDz3hjxr8FzziD7PDFbgC1gVCHoQSVu+DGkV23BOz9lsoxaLStp7sm2V4cWkqLc8Xuu3cMNh2yH2nNGHgSX64/vbzIzuWZWcGPRh9J2ufRrN5aN8w6oHJdPzEbYk5XnC+EhkAvl79yHIByq1LiQGAD1jfuO3F9sroNGOXA52hRVtXHl4hZOhASP0brMLX/usLIiXE5FaLoXNILAgxiYY4FnEXaZp00foE+q/Jdz+txA5tcYG/jtrG5qXYIp44b3y41B7TXirfNAIss4b',
                t: 'MTY3OTkwMTg1NC43NjkwMDA=',
                m: 'iBax+xWTsLizqO37UyAVpZvPjnpPo0Lqizz0q7t1J1o=',
                i1: 'I0vYAewUJP4nsX+2unXYSQ==',
                i2: 'zspel2P6HpG54isKfDxgBA==',
                zh: 'lBxCieQpkhIHubgKnc9ER+ae9k4MjDZMrA0aIX8vY9I=',
                uh: 'GXKHFIku+R8I38kgjeIt+4x6Zn3zqdqDBGV16lmSN4k=',
                hh: 'UvL/fnJa0zOA1MTuEDgiLTU6TTQ4uCIB+JWwFzj8+nY=',
            }
        };
        var trkjs = document.createElement('img');
        trkjs.setAttribute('src', '/cdn-cgi/images/trace/managed/js/transparent.gif?ray=7ae5e4804b7c8d10');
        trkjs.setAttribute('alt', '');
        trkjs.setAttribute('style', 'display: none');
        document.body.appendChild(trkjs);
        var cpo = document.createElement('script');
        cpo.src = '/cdn-cgi/challenge-platform/h/b/orchestrate/managed/v1?ray=7ae5e4804b7c8d10';
        window._cf_chl_opt.cOgUHash = location.hash === '' && location.href.indexOf('#') !== -1 ? '#' : location.hash;
        window._cf_chl_opt.cOgUQuery = location.search === '' && location.href.slice(0, location.href.length - window._cf_chl_opt.cOgUHash.length).indexOf('?') !== -1 ? '?' : location.search;
        if (window.history && window.history.replaceState) {
            var ogU = location.pathname + window._cf_chl_opt.cOgUQuery + window._cf_chl_opt.cOgUHash;
            history.replaceState(null, null, "\/search\/color\/be319b2fa13f0d4edb33ffef1c25aaa0?__cf_chl_rt_tk=xXGHEBrBJltfDIQ5NV_dx9sd5Bcu6s5ZGfwRnkS3rE8-1679901854-0-gaNycGzNClA" + window._cf_chl_opt.cOgUHash);
            cpo.onload = function() {
                history.replaceState(null, null, ogU);
            };
        }
        document.getElementsByTagName('head')[0].appendChild(cpo);
    }());
</script>


</body>
</html>
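The response above is a Cloudflare managed-challenge page rather than Ascii2D search results. A minimal sketch of detecting that case before parsing, using markers taken from the HTML above (not an exhaustive or official check):

def is_cloudflare_challenge(resp_text: str) -> bool:
    # Markers visible in the challenge page above.
    markers = ("Just a moment...", "_cf_chl_opt", "/cdn-cgi/challenge-platform/")
    return any(marker in resp_text for marker in markers)

# e.g. fail with a clearer message instead of returning an empty Ascii2DResponse:
# if is_cloudflare_challenge(resp_text):
#     raise RuntimeError("ascii2d.net returned a Cloudflare challenge; try a browser-like client or a proxy")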

Missing dependencies

I've just started learning Python. After pulling the code straight from GitHub, many required packages are missing and it won't run. Could it be made more convenient, so it runs right after downloading without complicated configuration?

TraceMoe search titles are all None; Ascii2D text comes back as from_encoding

Does tracemoe.search() need any parameters besides the image URL? The anime titles in the results are all None.
Also, res.raw[0].title is now a dict; should the documentation be updated?
Is there an encoding problem with Ascii2D? The text it returns is all from_encoding, so my program simply ignores it.
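If the title is the AniList title object (an assumption based on the anilistInfo behaviour discussed in a later issue), it is a dict of language variants rather than a single string; a small helper sketch for picking one:

from typing import Optional

def pick_title(title) -> Optional[str]:
    # Assumed shape: {"native": ..., "romaji": ..., "english": ...}
    if isinstance(title, dict):
        return title.get("english") or title.get("romaji") or title.get("native")
    return title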

Baidu image search raises `KeyError: 'simi'`

As the title says. I tried the request myself; one of the returned results looks like this:

{
    "contsign": "XXX",
    "height": 379,
    "width": 608,
    "thumbUrl": "http://mms0.baidu.com/it/......",
    "hoverUrl": "",
    "fromUrl": "XXX",
    "objUrl": "https://graph.baidu.com/pcpage/similar?car......",
    "index": 0,
    "page": 0
},

There is indeed no simi field; has Baidu stopped providing a similarity score? (fromPageTitle is gone as well.)
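A defensive parsing sketch for such an item when optional fields like simi and fromPageTitle may be absent (field names are taken from the JSON above; where the library actually parses them is not shown here):

from typing import Any, Dict

def parse_baidu_item(data: Dict[str, Any]) -> Dict[str, Any]:
    # Use .get() with defaults so missing keys don't raise KeyError.
    return {
        "similarity": data.get("simi"),              # may no longer be returned
        "page_title": data.get("fromPageTitle", ""),
        "thumbnail": data.get("thumbUrl", ""),
        "url": data.get("objUrl", ""),
    }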

Could more results be returned?

Hi author, I'm using your Google API to fetch images, and currently only 7 results come back in the response. Could fetching more images be supported?

Running the Google search demo raises an httpx error

Thank you for your outstanding work, but when I run cn/demo_google.py I get the following error:

2024-03-27 08:57:36.493 | ERROR    | __main__:<module>:40 - An error has been caught in function '<module>', process 'MainProcess' (1576451), thread 'MainThread' (140422575613760):
Traceback (most recent call last):

  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpx/_transports/default.py", line 66, in map_httpcore_exceptions
    yield
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpx/_transports/default.py", line 366, in handle_async_request
    resp = await self._pool.handle_async_request(req)
                 │    │     │                    └ <Request [b'GET']>
                 │    │     └ <function AsyncConnectionPool.handle_async_request at 0x7fb6a8b9aca0>
                 │    └ <AsyncHTTPProxy [Requests: 0 active, 0 queued | Connections: 0 active, 0 idle]>
                 └ <httpx.AsyncHTTPTransport object at 0x7fb6a875c820>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpcore/_async/connection_pool.py", line 216, in handle_async_request
    raise exc from None
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpcore/_async/connection_pool.py", line 196, in handle_async_request
    response = await connection.handle_async_request(
                     │          └ <function AsyncTunnelHTTPConnection.handle_async_request at 0x7fb6a8b9fb80>
                     └ <AsyncTunnelHTTPConnection [CONNECTION FAILED]>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpcore/_async/http_proxy.py", line 289, in handle_async_request
    connect_response = await self._connection.handle_async_request(
                             │    │           └ <function AsyncHTTPConnection.handle_async_request at 0x7fb6a8b96e50>
                             │    └ <AsyncHTTPConnection [CONNECTION FAILED]>
                             └ <AsyncTunnelHTTPConnection [CONNECTION FAILED]>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpcore/_async/connection.py", line 99, in handle_async_request
    raise exc
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpcore/_async/connection.py", line 76, in handle_async_request
    stream = await self._connect(request)
                   │    │        └ <Request [b'CONNECT']>
                   │    └ <function AsyncHTTPConnection._connect at 0x7fb6a8b96ee0>
                   └ <AsyncHTTPConnection [CONNECTION FAILED]>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpcore/_async/connection.py", line 122, in _connect
    stream = await self._network_backend.connect_tcp(**kwargs)
                   │    │                │             └ {'host': '127.0.0.1', 'port': 1081, 'local_address': None, 'timeout': 30, 'socket_options': None}
                   │    │                └ <function AutoBackend.connect_tcp at 0x7fb6a8b8a310>
                   │    └ <httpcore._backends.auto.AutoBackend object at 0x7fb6a875c970>
                   └ <AsyncHTTPConnection [CONNECTION FAILED]>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpcore/_backends/auto.py", line 30, in connect_tcp
    return await self._backend.connect_tcp(
                 │    │        └ <function AnyIOBackend.connect_tcp at 0x7fb6a8bb23a0>
                 │    └ <httpcore.AnyIOBackend object at 0x7fb6a8774eb0>
                 └ <httpcore._backends.auto.AutoBackend object at 0x7fb6a875c970>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpcore/_backends/anyio.py", line 121, in connect_tcp
    stream._raw_socket.setsockopt(*option)  # type: ignore[attr-defined] # pragma: no cover
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/contextlib.py", line 131, in __exit__
    self.gen.throw(type, value, traceback)
    │    │   │     │     │      └ <traceback object at 0x7fb6a86c6100>
    │    │   │     │     └ OSError('All connection attempts failed')
    │    │   │     └ <class 'OSError'>
    │    │   └ <method 'throw' of 'generator' objects>
    │    └ <generator object map_exceptions at 0x7fb6a8779eb0>
    └ <contextlib._GeneratorContextManager object at 0x7fb6a870c2e0>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
          └ <class 'httpcore.ConnectError'>

httpcore.ConnectError: All connection attempts failed


The above exception was the direct cause of the following exception:


Traceback (most recent call last):

> File "demo_google.py", line 40, in <module>
    test_sync()
    └ <function test_sync at 0x7fb6a8753c10>

  File "demo_google.py", line 15, in test_sync
    resp = google.search(url=url)
           │      │          └ 'https://raw.githubusercontent.com/kitUIN/PicImageSearch/main/demo/images/test03.jpg'
           │      └ <function Google.search at 0x7fb6a874fe50>
           └ <PicImageSearch.google.Google object at 0x7fb6a8751b50>

  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/PicImageSearch/sync.py", line 34, in syncified
    return coro if loop.is_running() else loop.run_until_complete(coro)
           │       │    │                 │    │                  └ <coroutine object Google.search at 0x7fb6a87b73c0>
           │       │    │                 │    └ <function BaseEventLoop.run_until_complete at 0x7fb6abf54af0>
           │       │    │                 └ <_UnixSelectorEventLoop running=False closed=False debug=False>
           │       │    └ <function BaseEventLoop.is_running at 0x7fb6abf54dc0>
           │       └ <_UnixSelectorEventLoop running=False closed=False debug=False>
           └ <coroutine object Google.search at 0x7fb6a87b73c0>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
           │      └ <method 'result' of '_asyncio.Task' objects>
           └ <Task finished name='Task-1' coro=<Google.search() done, defined at /home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/s...
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/PicImageSearch/google.py", line 95, in search
    resp = await self.get(self.base_url, params=params)
                 │    │   │    │                └ {'sbisrc': 1, 'image_url': 'https://raw.githubusercontent.com/kitUIN/PicImageSearch/main/demo/images/test03.jpg'}
                 │    │   │    └ 'https://www.google.com/searchbyimage'
                 │    │   └ <PicImageSearch.google.Google object at 0x7fb6a8751b50>
                 │    └ <function HandOver.get at 0x7fb6a874fc10>
                 └ <PicImageSearch.google.Google object at 0x7fb6a8751b50>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/PicImageSearch/network.py", line 200, in get
    resp = await client.get(url, params=params, **kwargs)
                 │      │   │           │         └ {}
                 │      │   │           └ {'sbisrc': 1, 'image_url': 'https://raw.githubusercontent.com/kitUIN/PicImageSearch/main/demo/images/test03.jpg'}
                 │      │   └ 'https://www.google.com/searchbyimage'
                 │      └ <function AsyncClient.get at 0x7fb6a8b46040>
                 └ <httpx.AsyncClient object at 0x7fb6a875c0a0>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpx/_client.py", line 1757, in get
    return await self.request(
                 │    └ <function AsyncClient.request at 0x7fb6a8b42c10>
                 └ <httpx.AsyncClient object at 0x7fb6a875c0a0>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpx/_client.py", line 1530, in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
                 │    │    │             │                      └ <httpx._client.UseClientDefault object at 0x7fb6a8fc56a0>
                 │    │    │             └ <httpx._client.UseClientDefault object at 0x7fb6a8fc56a0>
                 │    │    └ <Request('GET', 'https://www.google.com/searchbyimage?sbisrc=1&image_url=https%3A%2F%2Fraw.githubusercontent.com%2FkitUIN%2FP...
                 │    └ <function AsyncClient.send at 0x7fb6a8b42dc0>
                 └ <httpx.AsyncClient object at 0x7fb6a875c0a0>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpx/_client.py", line 1617, in send
    response = await self._send_handling_auth(
                     │    └ <function AsyncClient._send_handling_auth at 0x7fb6a8b42e50>
                     └ <httpx.AsyncClient object at 0x7fb6a875c0a0>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpx/_client.py", line 1645, in _send_handling_auth
    response = await self._send_handling_redirects(
                     │    └ <function AsyncClient._send_handling_redirects at 0x7fb6a8b42ee0>
                     └ <httpx.AsyncClient object at 0x7fb6a875c0a0>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpx/_client.py", line 1682, in _send_handling_redirects
    response = await self._send_single_request(request)
                     │    │                    └ <Request('GET', 'https://www.google.com/searchbyimage?sbisrc=1&image_url=https%3A%2F%2Fraw.githubusercontent.com%2FkitUIN%2FP...
                     │    └ <function AsyncClient._send_single_request at 0x7fb6a8b42f70>
                     └ <httpx.AsyncClient object at 0x7fb6a875c0a0>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpx/_client.py", line 1719, in _send_single_request
    response = await transport.handle_async_request(request)
                     │         │                    └ <Request('GET', 'https://www.google.com/searchbyimage?sbisrc=1&image_url=https%3A%2F%2Fraw.githubusercontent.com%2FkitUIN%2FP...
                     │         └ <function AsyncHTTPTransport.handle_async_request at 0x7fb6a8bb63a0>
                     └ <httpx.AsyncHTTPTransport object at 0x7fb6a875c820>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpx/_transports/default.py", line 366, in handle_async_request
    resp = await self._pool.handle_async_request(req)
                 │    │     │                    └ <Request [b'GET']>
                 │    │     └ <function AsyncConnectionPool.handle_async_request at 0x7fb6a8b9aca0>
                 │    └ <AsyncHTTPProxy [Requests: 0 active, 0 queued | Connections: 0 active, 0 idle]>
                 └ <httpx.AsyncHTTPTransport object at 0x7fb6a875c820>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/contextlib.py", line 131, in __exit__
    self.gen.throw(type, value, traceback)
    │    │   │     │     │      └ <traceback object at 0x7fb6a86c6480>
    │    │   │     │     └ ConnectError(OSError('All connection attempts failed'))
    │    │   │     └ <class 'httpcore.ConnectError'>
    │    │   └ <method 'throw' of 'generator' objects>
    │    └ <generator object map_httpcore_exceptions at 0x7fb6a8799f20>
    └ <contextlib._GeneratorContextManager object at 0x7fb6a875cb20>
  File "/home/v-qinyang/miniconda3/envs/openmmlab/lib/python3.8/site-packages/httpx/_transports/default.py", line 83, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
          │          └ 'All connection attempts failed'
          └ <class 'httpx.ConnectError'>

httpx.ConnectError: All connection attempts failed

However, the other demo_x.py scripts work fine, and the Google site opens normally in a browser.

httpx.ConnectError: All connection attempts failed

When I try the Yandex demo, I get this error message:

...
  File "/usr/local/lib/python3.10/dist-packages/httpx/_transports/default.py", line 83, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
          │          └ 'All connection attempts failed'
          └ <class 'httpx.ConnectError'>

httpx.ConnectError: All connection attempts failed

httpx google error

Your Google module has not been working for a long time. I changed the proxy ports and even removed the proxy, but the error still appears. The request gives the same error as in #122. To launch it I used https://github.com/kitUIN/PicImageSearch/blob/main/demo/en/demo_google.py.
I tried writing my own code and it worked right away.

import httpx
from bs4 import BeautifulSoup
from urllib.parse import urlencode

HEADERS = {
    "User-Agent": "Mozilla/5.0 (Linux; Android 6.0.1; SM-G920V Build/MMB29K) "\
                  "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.98 Mobile Safari/537.36"
}

def reverse(image_url):
    try:
        # Uncomment the following lines if you want to validate the image extension
        '''
        image_extension = image_url.rsplit('.', 1)[1].lower()
        if image_extension not in ["jpg", "jpeg", "png", "gif", "bmp"]:
            return {"error": "Invalid image URL"}
        '''

        params = {"safe": "off", "sbisrc": "tg", "image_url": image_url}
        url = f"https://images.google.com/searchbyimage?{urlencode(params)}"
        
        with httpx.Client(follow_redirects=True) as client:
            response = client.get(url, headers=HEADERS)
            response.raise_for_status()

        soup = BeautifulSoup(response.text, 'html.parser')
        result = {"result_text": "", "origin": response.text}
        # for example
        output_div = soup.select_one("div.r5a77d")
        if output_div:
            decoded_text = output_div.get_text()
            
            result["result_text"] = decoded_text
        else:
            return {"error": "Failed to find text output"}

        return result
    except httpx.RequestError as error:
        return {"error": f"Failed to reverse image: {str(error)}"}

# For testing:
if __name__ == "__main__":
    test_url = "https://raw.githubusercontent.com/kitUIN/PicImageSearch/main/demo/images/test03.jpg"
    result = reverse(test_url)
    print(result)

A small question about TraceMoe

The information that _get_anime_info in tracemoe.py fetches seems to already be covered by the api.trace.moe/search?anilistInfo request; could just one of the two be kept? (trace.moe's documentation says anilistInfo=true essentially just makes one extra request to the AniList API, so sending both feels unnecessary, and the query defined in _get_anime_info is already far more detailed than the one trace.moe sends.)

(Also, could the client in _get_anime_info share the one owned by the TraceMoe class, so a new client doesn't have to be opened every time?)

Ascii2D needs selected.websource

I would like a built-in string that indicates which social media site an item comes from.
Example: selected.websource should return 'Twitter' or 'pixiv'.
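Until such a field exists, a sketch of deriving the site from the author_url attribute used elsewhere in these issues (the prefix table is illustrative, not exhaustive):

def websource(item) -> str:
    # Map known URL prefixes to a human-readable site name.
    prefixes = {
        "https://twitter.com": "Twitter",
        "https://www.pixiv.net": "pixiv",
    }
    for prefix, name in prefixes.items():
        if (item.author_url or "").startswith(prefix):
            return name
    return "unknown"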

rework readme

I found your library a long time ago, when it had just started; back then there was only a Chinese version, which made it quite difficult to figure out what was responsible for what.

I made three translations of the README and the demo files. I don't know Chinese, so please double-check them.
PicImageSearch-main_extended.zip

Suggestions about the Baidu API

The problem in the code is that the program cannot tell whether the input is a file URL or a file stream, so every query is handled as if it were a URL; as a result, only URL-based images can be queried.
Working with the class the author built, I found that the class itself still works:

import requests
import json
import sys
import re
from typing import Any, Dict, List, Optional

class BaiDuItem:
    def __init__(self, data: Dict[str, Any]):
        self.origin: Dict[str, Any] = data  # raw data
        self.page_title: str = data["fromPageTitle"]  # page title
        self.title: str = data["title"][0]  # title
        self.abstract: str = data["abstract"]  # description text
        self.image_src: str = data["image_src"]  # image URL
        self.url: str = data["url"]  # URL of the page containing the image
        self.img_list: List[str] = data.get("imgList", [])  # other image URLs


class BaiDuResponse:
    def __init__(self, resp_text: str, resp_url: str):
        self.url: str = resp_url  # search result URL
        self.similar: List[Dict[str, Any]] = []  # similar-image results
        self.raw: List[BaiDuItem] = []  # source results
        # raw data
        self.origin: List[Dict[str, Any]] = json.loads(
            re.search(r"cardData = (.+);window\.commonData", resp_text)[1]  # type: ignore
        )
        self.same: Optional[Dict[str, Any]] = {}
        for i in self.origin:
            setattr(self, i["cardName"], i)
        if self.same:
            self.raw = [BaiDuItem(x) for x in self.same["tplData"]["list"]]
            info = self.same["extData"]["showInfo"]
            del info["other_info"]
            for y in info:
                for z in info[y]:
                    try:
                        self.similar[info[y].index(z)][y] = z
                    except IndexError:
                        self.similar.append({y: z})

        self.item: List[str] = [
            attr
            for attr in dir(self)
            if not callable(getattr(self, attr))
            and not attr.startswith(("__", "origin", "raw", "same", "url"))
        ]


baidu_url = "https://graph.baidu.com:443/upload?tn=pc&from=pc&image_source=PC_UPLOAD_IMAGE_FILE&"

request_file = {
    "image": ("2.jpg", open(r"2.jpg", "rb"))
}

aa = requests.post(baidu_url, files=request_file)

## Get the Baidu image-search result page URL
bb = json.loads(str(aa.text))
url = bb['data']['url']

cc = requests.post(url)
dd = BaiDuResponse(cc.text,url)

print(dd.similar)

I hope this helps with the author's improvements.

Newbie question: how do I use this?

A newbie here: how do I use this? Is it a plugin to download, or is it used online? Clicking "Get started" in the documentation doesn't lead to a usage interface either, and the downloaded ZIP can't be loaded as a Chrome extension or installed through Tampermonkey.
