socketteer / loom
Multiversal tree writing interface for human-AI collaboration
Thank you for the practical UI, I wonder why there aren't many like this around already (paid and free, local and remote models). Probably we're way ahead of the crowd or something. It's a mystery to me lol
The issue: it seems that I cannot use gpt-4 and gpt-3.5 (+turbo) with the standard settings. What works is text-davinci-003. The error I get if I choose gpt-4 or 3.5 is the following:
WARNING:root:Failed with exception: Invalid URL (POST /v1/engines/gpt-3.5-turbo/chat/completions), Retrying in 1 seconds...
WARNING:root:Failed with exception: Invalid URL (POST /v1/engines/gpt-3.5-turbo/chat/completions), Retrying in 2 seconds...
WARNING:root:Failed with exception: Invalid URL (POST /v1/engines/gpt-3.5-turbo/chat/completions), Retrying in 4 seconds...
cannot unpack non-iterable NoneType object
ERROR cannot unpack non-iterable NoneType object. Deleting failures
I tried finding the issue in the code, but could not see it (no expert in their API). If you look here: https://platform.openai.com/docs/api-reference/chat/create then it seems that the URL is not the same anymore. They use https://api.openai.com/v1/chat/completions
Thanks for looking into this.
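For reference, the chat models all share a single endpoint and take the model name in the JSON body, not in the URL path. A minimal sketch of the distinction (against the plain REST API, not any particular client library; note the per-engine route is itself legacy):

```python
def completions_url(model, chat=False, base="https://api.openai.com"):
    """Chat models (gpt-3.5-turbo, gpt-4) use one shared endpoint and
    pass the model in the request body; the per-engine path that
    triggers the Invalid URL error only applies to legacy completion
    engines like text-davinci-003."""
    if chat:
        return f"{base}/v1/chat/completions"
    return f"{base}/v1/engines/{model}/completions"
```

So the request body for a chat model would carry `"model": "gpt-3.5-turbo"` while the URL stays fixed.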
Adding a try-except on this line might be useful, logging that this error condition was reached.
I can't repro it; I think I switched models and hit a race condition or something, but here's the traceback in case someone else has this issue:
Traceback (most recent call last):
File "/usr/lib/python3.9/tkinter/__init__.py", line 1892, in __call__
return self.func(*args)
File "/home/uk000/gh/loom/components/modules.py", line 1999, in propagate
multiverse, ground_truth, prompt = self.state.generate_greedy_multiverse(max_depth=self.max_depth.get(),
File "/home/uk000/gh/loom/model.py", line 2471, in generate_greedy_multiverse
multiverse, ground_truth = greedy_word_multiverse(prompt=prompt, ground_truth=ground_truth, max_depth=max_depth,
File "/home/uk000/gh/loom/util/multiverse_util.py", line 41, in greedy_word_multiverse
token[1]['children'], _ = greedy_word_multiverse(prompt + token[0], ground_truth='', max_depth=max_depth-1,
File "/home/uk000/gh/loom/util/multiverse_util.py", line 41, in greedy_word_multiverse
token[1]['children'], _ = greedy_word_multiverse(prompt + token[0], ground_truth='', max_depth=max_depth-1,
File "/home/uk000/gh/loom/util/multiverse_util.py", line 34, in greedy_word_multiverse
logprobs = response.choices[0]["logprobs"]["top_logprobs"][0]
IndexError: list index out of range
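A defensive read of the logprobs field would avoid the crash when a response comes back without them. A sketch using plain dicts (the real response objects are dict-like; the helper name is made up here):

```python
def first_top_logprobs(response, default=None):
    """Return choices[0]['logprobs']['top_logprobs'][0] if present.
    Responses with empty choices or missing logprobs (e.g. after a
    mid-flight model switch) return `default` instead of raising
    IndexError/KeyError like the line in the traceback."""
    try:
        return response["choices"][0]["logprobs"]["top_logprobs"][0]
    except (IndexError, KeyError, TypeError):
        return default
```

The caller in greedy_word_multiverse could then stop recursing when this returns `default` instead of crashing the Tkinter callback.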
Attempting to delete any node fails and breaks the Loom UI. I can still click around, but the text window does not update.
Note: this is my first time using Loom.
This is the console output when attempting to delete a node:
loom$ python main.py
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
opening /home/nate/loom/data/GPT_chat.json
saving tree
Exception in Tkinter callback
Traceback (most recent call last):
File "/usr/lib/python3.8/tkinter/__init__.py", line 1892, in __call__
return self.func(*args)
File "/home/nate/loom/controller.py", line 52, in <lambda>
return lambda event=None, *args, _f=f, **kwargs: _f(*args, **kwargs)
File "/home/nate/loom/util/util.py", line 355, in f
return func(*args, **kwargs)
File "/home/nate/loom/controller.py", line 684, in delete_node
self.select_node(next_sibling)
File "/home/nate/loom/util/util.py", line 355, in f
return func(*args, **kwargs)
File "/home/nate/loom/controller.py", line 430, in select_node
self.write_textbox_changes()
File "/home/nate/loom/util/util.py", line 355, in f
return func(*args, **kwargs)
File "/home/nate/loom/controller.py", line 910, in write_textbox_changes
if self.state.preferences['editable']:
File "/home/nate/loom/model.py", line 257, in preferences
return self.state['preferences']
File "/home/nate/loom/model.py", line 313, in state
frames = self.accumulate_frames(self.selected_node)
File "/home/nate/loom/model.py", line 334, in accumulate_frames
for ancestor in self.ancestry(node):
File "/home/nate/loom/model.py", line 540, in ancestry
return node_ancestry(node, self.tree_node_dict)
File "/home/nate/loom/util/util_tree.py", line 152, in node_ancestry
while "parent_id" in node:
TypeError: argument of type 'NoneType' is not iterable
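One possible fix, sketched here against the names in the traceback (not verified against loom's actual node_ancestry), is to tolerate a None node left behind by the deletion:

```python
def safe_node_ancestry(node, tree_node_dict):
    """Variant of node_ancestry that tolerates node=None (a stale
    selection after deleting a node) instead of raising
    "argument of type 'NoneType' is not iterable"."""
    ancestry = []
    while node is not None and "parent_id" in node:
        ancestry.append(node)
        node = tree_node_dict.get(node["parent_id"])
    if node is not None:
        ancestry.append(node)  # the root node has no parent_id
    return list(reversed(ancestry))
```

The real fix is probably to clear or reassign `selected_node` before rebuilding state in delete_node, but a guard like this keeps the UI responsive either way.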
Noting this here in case it helps others: with Python 3.10 I had issues installing pandas.
In editing mode, at least the `p` shortcut remains active, which toggles the minimap side-pane and deletes whatever you were editing. I have noticed other characters also being treated as shortcuts while editing: `b`, spacebar... Some only seem to be treated as shortcuts sometimes (within edit mode).
This makes it near impossible to write anything. Perhaps I am misinterpreting how modes work.
Hi, just trying out your software now as it looks very promising!
But as soon as I open the Jar file the application seems to have some issues related to the resolution/scale of the UI. After some time I figured out that the issue is related to the Windows 11 scale setting: when I set it to 125% or higher it messes up the UI of the app.
Like so (the mouse is where the red square is, and it's pointing to the Close button on the Settings window):
In order to use loom, where do I save my OpenAI key and AI21 keys?
This is epic, can't wait to enter the LoomSpace!
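If loom follows the common convention of reading keys from environment variables, they would be exported in the shell before launching the app (the variable names below are assumptions, not confirmed against loom's source):

```shell
# Assumed variable names; replace the placeholder values with your real keys.
export OPENAI_API_KEY="sk-your-key-here"
export AI21_API_KEY="your-ai21-key"
python main.py
```

Putting the export lines in `~/.bashrc` (or your shell's equivalent) makes them persist across sessions.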
I'm experiencing an error where any keyboard input throws a str item assignment error - I suspect this is related to the string being immutable and needing to convert the string into a list.
Can you help me do a workaround for this?
Current Environment:
Full error:
Traceback (most recent call last):
File "/opt/homebrew/anaconda3/lib/python3.9/tkinter/__init__.py", line 1892, in __call__
return self.func(*args)
File "/Users/chiron/workspace/10Weaver/loom/components/modules.py", line 1206, in submit
self.callbacks["Submit"]["callback"](text=modified_text, auto_response=self.settings().get("auto_response", True))
File "/Users/chiron/workspace/10Weaver/loom/controller.py", line 49, in <lambda>
return lambda event=None, *args, _f=f, **kwargs: _f(*args, **kwargs)
File "/Users/chiron/workspace/10Weaver/loom/util/util.py", line 355, in f
return func(*args, **kwargs)
File "/Users/chiron/workspace/10Weaver/loom/controller.py", line 2004, in submit
new_child = self.create_child(toggle_edit=False)
File "/Users/chiron/workspace/10Weaver/loom/util/util.py", line 355, in f
return func(*args, **kwargs)
File "/Users/chiron/workspace/10Weaver/loom/controller.py", line 563, in create_child
new_child = self.state.create_child(parent=node)
File "/Users/chiron/workspace/10Weaver/loom/model.py", line 815, in create_child
self.rebuild_tree()
File "/Users/chiron/workspace/10Weaver/loom/model.py", line 36, in wrapper
output = func(self, *args, **kwargs)
File "/Users/chiron/workspace/10Weaver/loom/model.py", line 477, in rebuild_tree
self.tree_node_dict = {d["id"]: d for d in flatten_tree(self.tree_raw_data["root"])}
File "/Users/chiron/workspace/10Weaver/loom/util/util_tree.py", line 314, in flatten_tree
child["parent_id"] = d["id"]
TypeError: 'str' object does not support item assignment
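As a workaround sketch, flatten_tree could skip children that are bare strings rather than node dicts; the function name and tree shape here follow the traceback, but the real fix may be to repair the malformed entry in the saved tree JSON:

```python
def flatten_tree(d):
    """Flatten a node dict and its descendants into a list, tagging
    each child with its parent_id. If a child is a bare string instead
    of a dict, assigning child['parent_id'] raises the
    "'str' object does not support item assignment" TypeError, so
    malformed children are skipped here instead."""
    nodes = [d]
    for child in d.get("children", []):
        if not isinstance(child, dict):
            continue  # malformed entry in the saved tree file
        child["parent_id"] = d["id"]
        nodes.extend(flatten_tree(child))
    return nodes
```

It would also be worth logging the skipped entries so the corrupt node can be found and fixed in the JSON file.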
Hi, I know this is dumb... but
I don't know where to input my API key.
I tried EXPORT=
I can't figure it out... Otherwise, it looks amazing! Great work.
Also, do I need TensorFlow installed? I noticed a warning in the terminal.
It would be great to implement support for the new ChatGPT endpoint (gpt-3.5-turbo).
Note that API calls are slightly different due to the back-and-forth modality: https://platform.openai.com/docs/guides/chat/chat-vs-completions
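Because of that back-and-forth modality, a plain completion prompt has to be wrapped in a message list before it can be sent to the chat endpoint. A minimal sketch (the system message content is an assumption):

```python
def prompt_to_chat_messages(prompt, system="You are a writing assistant."):
    """Wrap a completion-style prompt in the message format expected by
    /v1/chat/completions: a system message setting the role, then the
    prompt as a user turn for the model to continue."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": prompt},
    ]
```

The response text then comes back under `choices[0]["message"]["content"]` rather than `choices[0]["text"]`, so the response-parsing side needs a matching change.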
It would be interesting to be able to use loom with open source LLMs such as GPT-Neo-X, FLAN-UL2, and LLaMA. The transformers library by Huggingface has support for almost every open source LLM through a standardized interface.
One approach to accomplish this could be direct integration. Another approach, to keep the loom client thin, could be to develop (maybe this already exists?) a shim that adapts the OpenAI API shape to a transformers backend.
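As a starting point for such a shim, here is a hedged sketch mapping an OpenAI-style request body onto transformers `generate` keyword arguments (the parameter names follow the public APIs of both libraries, but the mapping itself is an assumption, not an existing adapter):

```python
def openai_to_generate_kwargs(req):
    """Translate OpenAI completion-request parameters into their rough
    equivalents for transformers' model.generate. A shim server would
    apply this to each incoming request body before calling the model."""
    temperature = req.get("temperature", 1.0)
    return {
        "max_new_tokens": req.get("max_tokens", 16),
        "temperature": temperature,
        "top_p": req.get("top_p", 1.0),
        "num_return_sequences": req.get("n", 1),
        "do_sample": temperature > 0,  # temperature 0 maps to greedy decoding
    }
```

The remaining work for a full shim is serving this behind the `/v1/completions` route and reshaping the generated text (and logprobs, which loom uses for its multiverse view) back into the OpenAI response schema.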
Right click on a node and select duplicate.
Observed: an empty node is created as a sibling.
Expected: a node with the exact same text is created as a sibling.
I'm trying to reproduce the result shown in the first image in the README, but my visualization looks wrong.
After generating a child, I went to Info -> Node Metadata on the toolbar. For the two outputs I generated, the first several tokens are totally white, while the rest of the coloring was apparently wrong since it didn't align with whole tokens. I had the same problem across different devices, using different language models. Could this error be fixed?