
langchain-visualizer's People

Contributors

amosjyng, chrisza4, kyohei3, philipesteiff, reddingit, sam-cohan


langchain-visualizer's Issues

Error when importing library

Traceback (most recent call last):
  File "/mnt/g/Other computers/My Computer/*****/llm/tracing.py", line 1, in <module>
    import langchain_visualizer
  File "/home/alex/miniconda3/envs/py39/lib/python3.9/site-packages/langchain_visualizer/__init__.py", line 8, in <module>
    from .agents.tools import SerpAPIWrapper  # noqa
  File "/home/alex/miniconda3/envs/py39/lib/python3.9/site-packages/langchain_visualizer/agents/tools.py", line 11, in <module>
    from langchain.utilities.bash import BashProcess
ModuleNotFoundError: No module named 'langchain.utilities.bash'

Can't run the visualizer when the agent type is `OPENAI_FUNCTIONS` or multi-functions

~/.pyenv/versions/3.11.4/lib/python3.11/inspect.py:3201 in _bind

  3198                 # Process our '**kwargs'-like parameter
  3199                 arguments[kwargs_param.name] = kwargs
  3200             else:
> 3201                 raise TypeError(
  3202                     'got an unexpected keyword argument {arg!r}'.format(
  3203                         arg=next(iter(kwargs))))
  3204
TypeError: got an unexpected keyword argument 'functions'
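The TypeError comes from inspect.Signature.bind: somewhere in the call path the intercepted arguments are bound against a signature that declares neither a functions parameter nor a **kwargs catch-all, so the extra keyword passed by the OpenAI-functions agent is rejected. A minimal, hypothetical reproduction of just that mechanism (the signature below is a stand-in, not the library's actual method):

import inspect

# Hypothetical stand-in for the wrapped method: no `functions`
# parameter and no **kwargs catch-all.
def _wrapped(prompts, stop=None):
    ...

sig = inspect.signature(_wrapped)
sig.bind(["hi"], functions=[{"name": "get_weather"}])
# TypeError: got an unexpected keyword argument 'functions'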

RuntimeError: There is no current event loop in thread 'Thread-7 (process_request_thread)'.

I use the visualizer in specific cases to debug flows, but for normal usage of my API I don't want the visualization. However, simply importing langchain_visualizer causes an error during agent.run (presumably because it isn't wrapped in visualize).

  File "/home/me/emu/org/my-api/.venv/lib/python3.10/site-packages/langchain/chains/base.py", line 290, in run
    return self(args[0], callbacks=callbacks, tags=tags)[_output_key]
  File "/home/me/emu/org/my-api/.venv/lib/python3.10/site-packages/langchain_visualizer/hijacking.py", line 74, in overridden_call
    return asyncio.get_event_loop().run_until_complete(
  File "/home/me/emu/org/my-api/.venv/lib/python3.10/site-packages/nest_asyncio.py", line 45, in _get_event_loop
    loop = events.get_event_loop_policy().get_event_loop()
  File "/usr/lib/python3.10/asyncio/events.py", line 656, in get_event_loop
    raise RuntimeError('There is no current event loop in thread %r.'
RuntimeError: There is no current event loop in thread 'Thread-7 (process_request_thread)'.
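One workaround, sketched below under the assumption that agent.run is called from a worker thread and that the hijacked method looks the loop up via asyncio.get_event_loop, is to create and register an event loop for that thread before calling run:

import asyncio

def run_agent_in_worker_thread(agent, prompt):
    # Sketch only: give the worker thread its own event loop so that
    # the hijacked call's asyncio.get_event_loop() can find one.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    try:
        return agent.run(prompt)
    finally:
        asyncio.set_event_loop(None)
        loop.close()

This only papers over the symptom; a flag to disable the hijacking entirely when the visualization is not wanted would be the cleaner fix.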

Async calls do not show up as expected

Hi, thanks for this excellent project. I noticed that in llms/base.py you only hijack generate, not agenerate, for BaseLLM. Why not also hijack agenerate?
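For concreteness, an async counterpart to the sync override shown in the hotfix snippet further down this page might look like the sketch below. The names get_viz_wrapper, og_fn, and _should_trace are taken from that snippet; how hijacking.py actually wires up the async path is an assumption here, not a description of the PR.

def get_overridden_async_call(viz_cls, og_method_name):
    async def overridden_async_call(og_self, *args, **kwargs):
        """Async analogue of overridden_call: await the wrapper
        directly instead of driving it with run_until_complete."""
        ice_agent = get_viz_wrapper(viz_cls, og_self, og_method_name)
        if (
            not hasattr(og_self.__class__, "_should_trace")
            or og_self.__class__._should_trace
        ):
            # ICE displays class name in visualization
            ice_agent.__class__.__name__ = og_self.__class__.__name__
            kwargs.pop("callbacks", None)
            return await ice_agent.run(*args, **kwargs)
        return await ice_agent.og_fn(og_self, *args, **kwargs)

    return overridden_async_call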

Without hijacking agenerate as well, the output is missing the LLM call:
[Screenshot: 2023-06-17 11:10 AM]

I created a simple PR that fixes this to look somewhat more reasonable, but the final output still never shows up (not sure why):
[Screenshot: 2023-06-17 11:26 AM]

If I change all my calls to be synchronous, then everything looks much nicer with prices showing up too:
[Screenshot: 2023-06-17 11:27 AM]

I would appreciate it if you could look at the PR and let me know why the prices are not working, and also why the final result does not show up.

Thanks again for your work.

Request: supporting Pydantic v2

The request is: can langchain-visualizer support Pydantic v2? As of langchain-visualizer==0.0.33, Pydantic v2 is unsupported:

> poetry add langchain-visualizer
Using version ^0.0.33 for langchain-visualizer

...

Because langchain-visualizer (0.0.33) depends on pydantic (>=1.0.0,<2.0.0)
 and no versions of langchain-visualizer match >0.0.33,<0.0.34, langchain-visualizer (>=0.0.33,<0.0.34) requires pydantic (>=1.0.0,<2.0.0).
So, because myproject depends on both pydantic (>=2) and langchain-visualizer (^0.0.33), version solving failed.

#76 is related

Jupyter proxy support and/or rendering inside a managed Jupyter notebook

So a very common use case is trying to use this tool in a SageMaker notebook environment. Because SageMaker notebooks are managed instances, the only way to access the ports is through the Jupyter proxy (https://<notebook_url>/proxy/8935/). This issue is probably more related to the Ought/ICE code, but I'm noticing that it does not work through the proxy. First, when the page loads, index.html tries to load assets from https://<notebook_url>/assets/ instead of https://<notebook_path>/proxy/8935/assets; even after hard-coding that path and making sure the assets are served, there are internal server errors. I wonder if there is a good way to either fix the proxy compatibility with some sort of patch or, better still, force the server visualization to show up in a notebook iFrame instead of a new tab. I know this is probably outside the scope of this project, but it could be an important feature for adoption by the many companies that use SageMaker for development purposes.

In the absence of such a solution, I may have to consider running the server on a separate machine and making sure the messages are routed correctly there, but it is not clear to me that this would work, since the visualization may require access to the cache and other generated files.

Not working with latest version of langchain==0.0.158

Seems that langchain refactored the callbacks in langchain-ai/langchain@d3ec00b

A hotfix is to remove the "callbacks" argument from kwargs in the hijacking.py file:

def get_overridden_call(viz_cls, og_method_name):
    def overridden_call(og_self, *args, **kwargs):
        """Preserve sync nature of OG call method"""
        ice_agent = get_viz_wrapper(viz_cls, og_self, og_method_name)
        if (
            not hasattr(og_self.__class__, "_should_trace")
            or og_self.__class__._should_trace
        ):
            # ICE displays class name in visualization
            ice_agent.__class__.__name__ = og_self.__class__.__name__
            # langchain now passes a `callbacks` kwarg that the ICE
            # wrapper's run() does not accept, so drop it
            if "callbacks" in kwargs:
                del kwargs["callbacks"]

            return asyncio.get_event_loop().run_until_complete(
                ice_agent.run(*args, **kwargs)
            )

        return ice_agent.og_fn(og_self, *args, **kwargs)

    return overridden_call

Support for langchain >= 0.1?

Hey, any thoughts on this? I saw that the langchain version is capped here. Or is the intention that we should find alternative solutions if we use langchain > 0.1?

Demo looks awesome and would love to try it in our project!

Error installing

pip install langchain-visualizer leads to "no matching distribution found". I'm currently using Python 3.11.0.


ERROR: Ignored the following versions that require a different python version: 0.0.1 Requires-Python >=3.10,<3.11; 0.0.10 Requires-Python >=3.10,<3.11; 0.0.2 Requires-Python >=3.10,<3.11; 0.0.3 Requires-Python >=3.10,<3.11; 0.0.4 Requires-Python >=3.10,<3.11; 0.0.5 Requires-Python >=3.10,<3.11; 0.0.6 Requires-Python >=3.10,<3.11; 0.0.7 Requires-Python >=3.10,<3.11; 0.0.8 Requires-Python >=3.10,<3.11; 0.0.9 Requires-Python >=3.10,<3.11
ERROR: Could not find a version that satisfies the requirement langchain-visualizer (from versions: none)
ERROR: No matching distribution found for langchain-visualizer

Time cost

What an amazing tool!!!

Please consider adding time cost.

Is this package supported in Jupyter notebooks?

I didn't see any examples for Jupyter notebooks. Also, I got an error trying to use this package in a Jupyter notebook:

usage: ipykernel_launcher.py [-h]
                             [--mode {human,augmented,augmented-cached,machine,fake,test,approval,machine-cached}]
                             [--trace | --no-trace] [--version]
ipykernel_launcher.py: error: unrecognized arguments: --ip=127.0.0.1 --stdin=9003 --control=9001 --hb=9000 --Session.signature_scheme="hmac-sha256" --Session.key=b"bbc7f181-2147-4e04-9da0-133c6a413dab" --shell=9002 --transport="tcp" --iopub=9004 --f=<path-to-log-json>

So I wanted to ask whether Jupyter notebooks are supported or not. Thanks!
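The error suggests that an argument parser in the import chain (judging by the --mode and --trace flags, the ICE dependency's own CLI settings) is reading the kernel's sys.argv at import time. A common workaround for this class of clash, sketched here without any claim that it is sufficient for this package, is to hide the kernel's flags during the import:

import sys

# Sketch: temporarily hide ipykernel's command-line flags from any
# argparse that runs at import time, then restore them.
_saved_argv = sys.argv
sys.argv = _saved_argv[:1]
try:
    import langchain_visualizer  # noqa: F401
finally:
    sys.argv = _saved_argv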

How to render the visualization result in a Jupyter notebook?

Hi, I use langchain in a Docker container, with container port 8359 mapped to host port 1233, and 8888 mapped to host port 1231. But when I run the Jupyter notebook demo, the result isn't rendered.

langchain 0.0.239 and langchain-visualizer 0.0.29 are used.

Could anyone tell me how I can get the rendered result?

from langchain_visualizer.jupyter import visualize
visualize(demo, width=1000, height=500)

2023-09-07 04:29.54.280017 [info     ] Trace: http://127.0.0.1:8935/traces/01H9PYEQQ6J0HVA4QVMCAG32BT
2023-09-07 04:29.54.431622 [info     ] Starting server, set OUGHT_ICE_AUTO_SERVER=0 to disable.
2023-09-07 04:29.55.248568 [info     ] Server started! Run `python -m ice.server stop` to stop it.
Rendering http://127.0.0.1:8935/traces/01H9PYEQQ6J0HVA4QVMCAG32BT in notebook

[Screenshot: untitled]

libyaml-dev requirement is not documented / automated

Bug

Installing langchain-visualizer with pip was not enough for the library to import successfully on my system.

When attempting to import it, it would fail on the following import, invoked by the ice library:

from yaml import CLoader

No CLoader existed in that environment.

Environment

My project uses a Jupyter server running inside a Docker container, based on an Ubuntu 22 (Jammy) image. I've been installing the Python libs with a requirements.txt passed to pip inside the Dockerfile.

Manual Solution

Here is what worked for me.

  1. Invoke apt in the Dockerfile to install libyaml-dev
    RUN apt update && apt install -y libyaml-dev
    
  2. Add the following two lines to requirements.txt (passed to pip with -r):
    --global-option='--with-libyaml'
    langchain-visualizer
    

At a minimum, this should be documented in the README.md.

Informative Error Message

The import statement could catch the import error and more helpfully inform the user that libyaml-dev is required.
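For example, a defensive import along these lines (a sketch; where it would live inside the package is left open) would surface the missing system dependency clearly:

try:
    from yaml import CLoader  # C-accelerated loader; needs PyYAML built against libyaml
except ImportError as err:
    raise ImportError(
        "PyYAML was installed without libyaml support. Install the "
        "libyaml-dev system package and reinstall PyYAML, e.g. with "
        "--global-option='--with-libyaml'."
    ) from err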

Automatic Solution

Is there a way to require C/C++ libs in Python's package managers? I have no idea.
