
ollama-ui / ollama-ui


Simple HTML UI for Ollama

Home Page: https://ollama-ui.github.io/ollama-ui/

License: MIT License

JavaScript 61.75% CSS 6.27% HTML 22.17% Shell 5.54% Makefile 3.94% Dockerfile 0.33%


ollama-ui's People

Contributors

bastos, divnyi, gmelsby, hmottestad, lmlsna, oemmerson, ollama-ui, rtcfirefly, tarqd


ollama-ui's Issues

`make` fails to make

Inside of the ollama-ui directory:

$ make
# Check if resources directory exists, if not create it
# Check SHA-256 hash
shasum: resources/bootstrap.bundle.min.js: No such file or directory
resources/bootstrap.bundle.min.js: FAILED open or read
shasum: resources/bootstrap.min.css: No such file or directory
resources/bootstrap.min.css: FAILED open or read
shasum: resources/marked.min.js: No such file or directory
resources/marked.min.js: FAILED open or read
shasum: resources/purify.min.js: No such file or directory
resources/purify.min.js: FAILED open or read
shasum: WARNING: 4 listed files could not be read
make: *** [Makefile:20: download_resources] Error 1

This is on Windows 10.
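The overwritten-looking shasum output above is the classic symptom of CRLF line endings on Windows: each filename in the checksum file ends with a carriage return, so shasum cannot find the file, and the stray \r makes the error message overwrite the start of the line. A minimal sketch of a workaround (the hash and filename below are placeholders, not the real Makefile values):

```shell
# Simulate a checksum file with Windows CRLF line endings
# ("deadbeef" and the filename are placeholders, not the real Makefile values)
printf 'deadbeef  resources/example.min.js\r\n' > sums.txt
# Stripping the carriage returns lets shasum read the filenames again
tr -d '\r' < sums.txt > sums.unix.txt
od -c sums.unix.txt | head -n 1   # the cleaned file contains no \r
```

Running the cleaned file through shasum -c would then report a proper hash mismatch or OK instead of "No such file or directory".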

Request to change license to BSD

Hi,

Since this repo seems to be fairly inactive, may I request a change of license to BSD so that we can continue it in a forked version? This would allow others to build on it without needing to reference the original author(s) in forked projects or modifications.

Thank you.

TypeError: Failed to fetch and Access to fetch has been blocked

In one terminal, I have run:

> make download_resources
# Check if resources directory exists, if not create it
# Check SHA-256 hash
resources/bootstrap.bundle.min.js: OK
resources/bootstrap.min.css: OK
resources/marked.min.js: OK
resources/purify.min.js: OK
> ls resources
bootstrap.bundle.min.js bootstrap.min.css       marked.min.js           purify.min.js
> ./ollama serve
2023/09/16 15:22:34 routes.go:536: Listening on 127.0.0.1:11434
...

In another terminal:

> make web_server
python3 -m http.server
Serving HTTP on :: port 8000 (http://[::]:8000/) ...
::ffff:127.0.0.1 - - [16/Sep/2023 15:32:15] "GET / HTTP/1.1" 304 -
::ffff:127.0.0.1 - - [16/Sep/2023 15:32:16] code 404, message File not found
::ffff:127.0.0.1 - - [16/Sep/2023 15:32:16] "GET /resources/bootstrap.bundle.min.js.map HTTP/1.1" 404 -
::ffff:127.0.0.1 - - [16/Sep/2023 15:32:16] code 404, message File not found
::ffff:127.0.0.1 - - [16/Sep/2023 15:32:16] "GET /resources/purify.min.js.map HTTP/1.1" 404 -
::ffff:127.0.0.1 - - [16/Sep/2023 15:32:16] code 404, message File not found
::ffff:127.0.0.1 - - [16/Sep/2023 15:32:16] "GET /resources/bootstrap.min.css.map HTTP/1.1" 404 -

The UI isn't working; there is no response. Opening Chrome's developer tools shows:

screenshot of Chrome dev tools

Any idea what's going on here?

[FR] Add support for EXPORT/IMPORT chats

It would be great to add some kind of export/import chats feature. Given the (current) serverless approach, we cannot share chats by link in the classical way, so having an export/import format would be useful. For example, if we used JSON, it could look like this:

{
  "version": 1,
  "hostname": "localhost",
  "port": 11434,
  "model": "llama2:70b",
  "name": "Working with LLMs",
  "messages": [
    {
      "q": "Hi this is Mike, how are you?",
      "a": "Hello Mike! I'm doing well, thanks for asking. It's nice to meet you. Is there anything you'd like to chat about or ask me? I'm here to help with any questions you might have."
    },
    { "q": "Tell me about LLMs" }
  ]
}
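A minimal sketch of such an export/import round trip, assuming the field names from the example above (exportChat and importChat are hypothetical helper names, not existing ollama-ui functions); storing messages as an array of {q, a} objects avoids the duplicate-key problem a plain object would have:

```javascript
// Hypothetical export/import helpers; the format fields follow the JSON
// example above and are assumptions, not an implemented ollama-ui API.
function exportChat(chat) {
  return JSON.stringify({
    version: 1,
    hostname: chat.hostname,
    port: chat.port,
    model: chat.model,
    name: chat.name,
    messages: chat.messages, // array of {q, a} pairs, so keys never collide
  }, null, 2);
}

function importChat(json) {
  const data = JSON.parse(json);
  if (data.version !== 1) {
    throw new Error(`unsupported export version: ${data.version}`);
  }
  return data;
}

// Round-trip example
const sample = {
  hostname: "localhost",
  port: 11434,
  model: "llama2:70b",
  name: "Working with LLMs",
  messages: [
    { q: "Hi this is Mike, how are you?", a: "Hello Mike!" },
    { q: "Tell me about LLMs" },
  ],
};
console.log(importChat(exportChat(sample)).messages.length); // 2
```

A version field from day one would let the format evolve without breaking older exports.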

Default persistent history

I'd like to be able to keep chat history with ollama-ui persistent after restarts, by default (avoid needing to "save"). Are there instructions or resources that show how to do this, if possible? If not, it would be a nice feature.
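A minimal sketch of automatic persistence, assuming the browser's localStorage is used (the key and function names are illustrative, not ollama-ui's actual code); calling the save function after every model reply would make history survive restarts without a manual "save":

```javascript
// Hypothetical persistence helpers; HISTORY_KEY and the function names are
// illustrative, not part of ollama-ui.
const HISTORY_KEY = "ollama-ui-history";

function saveHistory(storage, messages) {
  // localStorage only stores strings, so serialize to JSON
  storage.setItem(HISTORY_KEY, JSON.stringify(messages));
}

function loadHistory(storage) {
  const raw = storage.getItem(HISTORY_KEY);
  return raw === null ? [] : JSON.parse(raw);
}

// In the browser this would be window.localStorage; a tiny in-memory
// stand-in makes the sketch runnable outside one too:
const memoryStorage = (() => {
  const m = new Map();
  return {
    setItem: (k, v) => m.set(k, String(v)),
    getItem: (k) => (m.has(k) ? m.get(k) : null),
  };
})();

saveHistory(memoryStorage, [{ q: "hi", a: "hello" }]);
console.log(loadHistory(memoryStorage).length); // 1
```

loadHistory would run once on page load to restore the previous conversation.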

Feature: Make ollama-ui usable from other devices

Hello,

I have been working on configuring **ollama-ui** to be accessible over my local LAN, with the device running it functioning as a server. I've managed to do this successfully by modifying the Makefile to allow Ollama to listen on 0.0.0.0 and specifying the **OLLAMA_ORIGINS** to the FQDN of the server as shown below:

OLLAMA_HOST="0.0.0.0:11434" OLLAMA_ORIGINS=http://192.168.1.44:8000 ollama serve

Issues:

However, this approach required altering the visibility of the host-address-select div by removing display: none;, enabling me to set my server IP: 192.168.1.44.

<div id="host-address-select" style="display: none;">
  <div class="d-flex align-items-center mb-2">
    <label for="host-address" class="form-label me-2" style="font-size: smaller;">Host:</label>
    <input id="host-address" class="form-control" type="text" placeholder="http://localhost:11434" style="width: auto;">
  </div>
</div>


Proposed Enhancement:

To refine this and make the configuration more user-friendly, I propose the following enhancements:

- Configuration UI Element:
Introduce a settings icon or equivalent UI element.
When clicked, this would reveal the input field allowing users to modify the IP address.

- Placeholder Text:
Update the placeholder text in the input field to contain an IP address example instead of a URI, providing clearer guidance to users setting up their configurations.

- Readme Documentation:
Enrich the README with clear instructions on how to adjust the Makefile and other settings to make ollama-ui accessible over LAN, providing a smoother experience for users unfamiliar with these configurations.

Unobfuscated/minified version of the /resources directory?

I've been using this UI and I'm a big fan of it, but I've noticed that the resources of the webpage are minified. I'd love to have the unminified versions of these files so I can properly make changes and edit the code, without having to unminify it first.

Add the ability to attach image and multimodal prompt for llava

Hi, I like this repo a lot because it is very simple and straightforward; however, it is well behind the latest Ollama developments.

I'd like to request a new feature for multimodal support:

1.) The ability to attach a file (to ask about)
2.) The ability to attach a document (to interact against)

Thank you very much.
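Ollama's /api/generate endpoint accepts base64-encoded images in an "images" array, which is how llava support could be wired up. A minimal sketch of the request body (the helper name and sample data are hypothetical):

```javascript
// Hypothetical request-body builder for a multimodal prompt; Ollama's
// /api/generate accepts base64-encoded images in an "images" array.
function buildLlavaRequest(prompt, base64Images) {
  return {
    model: "llava",
    prompt: prompt,
    images: base64Images, // plain base64 strings, no "data:" URL prefix
    stream: false,
  };
}

const body = buildLlavaRequest("What is in this picture?", ["iVBORw0KG..."]);
console.log(JSON.stringify(body, null, 2));
```

The body would then be POSTed with fetch to http://localhost:11434/api/generate, the same endpoint the UI already uses for text prompts.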

Stop model daemon

Dumb question, but how can I just stop the LLM model? With Mistral, if I launch it with ollama run mistral in a terminal I can simply type /exit, but with your script I don't know how. Thanks!

Interaction with models

Hi, and thanks for your project.
It works flawlessly on my machine.
I am not a dev, I am a noob; I had been trying to create a GUI for Ollama before discovering your project.
I forked it and added a feature: a web UI to interact with models.

It is a raw attempt, but I would appreciate it if you let me know what you think about it.

https://github.com/fatualux/ollama-ui

extra prefix

Can the [/INST] prefix that appears during responses be configured anywhere?
Screenshot 2023-12-05 at 11 05 41

Bug: GitHub Pages Hosted version doesn't work due to HTTPS violation

Problem

net::ERR_BLOCKED_BY_CLIENT

Screenshot 2023-10-21 at 11 40 54 AM Screenshot 2023-10-21 at 11 41 47 AM

The site at https://ollama-ui.github.io/ollama-ui/ makes a request to http://localhost:11434/api/tags which is blocked BY THE CLIENT (not the server), for a few reasons:

The request needs to be HTTPS

The hosted version should request https://localhost:11443, NOT http://localhost:11434.

The localhost server should be valid HTTPS

This can be solved easily, with certificates that are recognized by all web browsers, by using caddy:

  1. Install caddy
    # Mac, Linux
    curl https://webi.sh/caddy | sh
    
    # Windows
    curl.exe https://webi.ms/caddy | powershell
  2. Create a Reverse Proxy Caddyfile
    ./Caddyfile:
    https://localhost:11443 {
        reverse_proxy localhost:11434
    }
  3. Run caddy
    caddy run --config ./Caddyfile

The Origin header is omitted for localhost

Running the ollama server with OLLAMA_ORIGINS='https://github.com/ollama-ui/ollama-ui' won't work because the Origin header isn't sent. Instead the wildcard must be used.

OLLAMA_HOST=0.0.0.0:11434 OLLAMA_ORIGINS='*' ollama serve

Add the ability of saving the conversation in a file and loading it

I've been enjoying using your Chrome extension and find it to be extremely useful. I wanted to take a moment to propose a feature that I believe would significantly enhance the user experience.

Feature Request:
Would you consider adding a feature that enables users to save their conversations to a local file? Additionally, it would be beneficial to have the option to load these saved conversations at a later time.

Implementation Suggestions:

  • A "Start Recording" button could initiate the process of saving conversations to a local file. Every time the Ollama module responds, we can concatenate the new content to the file.
  • Users could select which saved conversations to reload via a dropdown menu or list within the extension's interface.

I think this feature would offer a convenient way to revisit past conversations.

Thank you in advance

HTML page giving error : `Unable to access Ollama server`

I have entered the right path of the Ollama API: 0.0.0.0:11434


Ollama-ui was unable to communicate with Ollama due to the following error:

Unexpected token '<', "<!DOCTYPE "... is not valid JSON

How can I expose the Ollama server?

By default, Ollama allows cross origin requests from 127.0.0.1 and 0.0.0.0.

To support more origins, you can use the OLLAMA_ORIGINS environment variable:

OLLAMA_ORIGINS=http://127.0.0.1:8000 ollama serve
Also see: https://github.com/jmorganca/ollama/blob/main/docs/faq.md

Screenshot from 2024-01-15 13-05-07

Screenshot from 2024-01-15 13-07-11


Request: adding `favicon.ico`

When running python3 -m http.server and going to http://localhost:8000/

Serving HTTP on :: port 8000 (http://[::]:8000/) ...
::1 - - [17/Sep/2023 11:14:47] "GET / HTTP/1.1" 200 -
::1 - - [17/Sep/2023 11:14:47] "GET /resources/bootstrap.min.css HTTP/1.1" 200 -
::1 - - [17/Sep/2023 11:14:47] "GET /resources/bootstrap.bundle.min.js HTTP/1.1" 200 -
::1 - - [17/Sep/2023 11:14:47] "GET /chat.css HTTP/1.1" 200 -
::1 - - [17/Sep/2023 11:14:47] "GET /resources/purify.min.js HTTP/1.1" 200 -
::1 - - [17/Sep/2023 11:14:47] "GET /resources/marked.min.js HTTP/1.1" 200 -
::1 - - [17/Sep/2023 11:14:47] "GET /api.js HTTP/1.1" 200 -
::1 - - [17/Sep/2023 11:14:47] "GET /chat.js HTTP/1.1" 200 -
::1 - - [17/Sep/2023 11:14:47] code 404, message File not found
::1 - - [17/Sep/2023 11:14:47] "GET /favicon.ico HTTP/1.1" 404 -

Can we add a favicon here, so this favicon.ico error doesn't show up?

Perhaps the Ollama icon could be used.
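As a sketch: browsers request /favicon.ico by default, so simply serving an icon file at the web root would stop the 404. Alternatively, it can be referenced explicitly (assuming the icon is saved as favicon.ico next to index.html):

```html
<!-- In the <head> of index.html; assumes favicon.ico sits at the web root -->
<link rel="icon" href="favicon.ico" type="image/x-icon">
```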
