Comments (21)
WIN11 WSL2 Ubuntu 22.04
pip install git+https://github.com/PotatoSpudowski/fastLLaMa.git@feature/pip
Defaulting to user installation because normal site-packages is not writeable
Collecting git+https://github.com/PotatoSpudowski/fastLLaMa.git@feature/pip
Cloning https://github.com/PotatoSpudowski/fastLLaMa.git (to revision feature/pip) to /tmp/pip-req-build-5bptunmj
Running command git clone --filter=blob:none --quiet https://github.com/PotatoSpudowski/fastLLaMa.git /tmp/pip-req-build-5bptunmj
Running command git checkout -b feature/pip --track origin/feature/pip
Switched to a new branch 'feature/pip'
Branch 'feature/pip' set up to track remote branch 'feature/pip' from 'origin'.
Resolved https://github.com/PotatoSpudowski/fastLLaMa.git to commit 768a34747213d26ea4550b4f7d9d1ebea0c88a11
Preparing metadata (setup.py) ... done
Requirement already satisfied: numpy>=1.24.2 in /usr/local/lib/python3.10/dist-packages (from fastllama==0.5) (1.24.2)
Requirement already satisfied: py-cpuinfo>=9.0.0 in /usr/local/lib/python3.10/dist-packages (from fastllama==0.5) (9.0.0)
Requirement already satisfied: inquirer>=3.1.3 in /usr/local/lib/python3.10/dist-packages (from fastllama==0.5) (3.1.3)
Requirement already satisfied: cmake>=3.20.2 in /usr/local/lib/python3.10/dist-packages (from fastllama==0.5) (3.26.3)
Requirement already satisfied: blessed>=1.19.0 in /usr/local/lib/python3.10/dist-packages (from inquirer>=3.1.3->fastllama==0.5) (1.20.0)
Requirement already satisfied: python-editor>=1.0.4 in /usr/local/lib/python3.10/dist-packages (from inquirer>=3.1.3->fastllama==0.5) (1.0.4)
Requirement already satisfied: readchar>=3.0.6 in /usr/local/lib/python3.10/dist-packages (from inquirer>=3.1.3->fastllama==0.5) (4.0.5)
Requirement already satisfied: wcwidth>=0.1.4 in /usr/local/lib/python3.10/dist-packages (from blessed>=1.19.0->inquirer>=3.1.3->fastllama==0.5) (0.2.6)
Requirement already satisfied: six>=1.9.0 in /usr/lib/python3/dist-packages (from blessed>=1.19.0->inquirer>=3.1.3->fastllama==0.5) (1.16.0)
Requirement already satisfied: setuptools>=41.0 in /home/v22/.local/lib/python3.10/site-packages (from readchar>=3.0.6->inquirer>=3.1.3->fastllama==0.5) (67.7.2)
Building wheels for collected packages: fastllama
Building wheel for fastllama (setup.py) ... error
error: subprocess-exited-with-error
× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [62 lines of output]
running bdist_wheel
running build
running build_py
creating build
creating build/lib
creating build/lib/fastLLaMa
copying fastLLaMa/__init__.py -> build/lib/fastLLaMa
copying fastLLaMa/api.py -> build/lib/fastLLaMa
/home/v22/.local/lib/python3.10/site-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated.
!!
********************************************************************************
Please avoid running ``setup.py`` directly.
Instead, use pypa/build, pypa/installer or
other standards-based tools.
See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
********************************************************************************
!!
self.initialize_options()
installing to build/bdist.linux-x86_64/wheel
running install
creating fastllama.egg-info
writing fastllama.egg-info/PKG-INFO
writing dependency_links to fastllama.egg-info/dependency_links.txt
writing requirements to fastllama.egg-info/requires.txt
writing top-level names to fastllama.egg-info/top_level.txt
writing manifest file 'fastllama.egg-info/SOURCES.txt'
reading manifest file 'fastllama.egg-info/SOURCES.txt'
adding license file 'LICENSE'
writing manifest file 'fastllama.egg-info/SOURCES.txt'
Traceback (most recent call last):
File "<string>", line 2, in <module>
File "<pip-setuptools-caller>", line 34, in <module>
File "/tmp/pip-req-build-5bptunmj/setup.py", line 55, in <module>
setup(
File "/home/v22/.local/lib/python3.10/site-packages/setuptools/__init__.py", line 107, in setup
return distutils.core.setup(**attrs)
File "/home/v22/.local/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 185, in setup
return run_commands(dist)
File "/home/v22/.local/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 201, in run_commands
dist.run_commands()
File "/home/v22/.local/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 969, in run_commands
self.run_command(cmd)
File "/home/v22/.local/lib/python3.10/site-packages/setuptools/dist.py", line 1244, in run_command
super().run_command(command)
File "/home/v22/.local/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
cmd_obj.run()
File "/usr/lib/python3/dist-packages/wheel/bdist_wheel.py", line 335, in run
self.run_command('install')
File "/home/v22/.local/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
self.distribution.run_command(command)
File "/home/v22/.local/lib/python3.10/site-packages/setuptools/dist.py", line 1244, in run_command
super().run_command(command)
File "/home/v22/.local/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
cmd_obj.run()
File "/tmp/pip-req-build-5bptunmj/setup.py", line 28, in run
for ext in self.extensions:
File "/home/v22/.local/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 107, in __getattr__
raise AttributeError(attr)
AttributeError: extensions
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for fastllama
Running setup.py clean for fastllama
Failed to build fastllama
ERROR: Could not build wheels for fastllama, which is required to install pyproject.toml-based projects
from fastllama.
@PotatoSpudowski Just ran pip install on the feature/pip
branch and can confirm it installs now, thanks!
Ah, I found the issue: deleting the directory examples/python/fastllama/
made it work.
I have added initial support for pip in the feature/pip
branch.
To install fastLLaMa using pip, run
pip install git+https://github.com/PotatoSpudowski/fastLLaMa.git@feature/pip
Importing should look like
from fastLLaMa import Model
If anyone can test it out and let me know if anything is breaking, I would appreciate it!
Based on the machine you are using, pip will install it with the appropriate flags.
This way, I don't have to worry about managing packages for different hardware for now!
@amitsingh19975 @stduhpf Not urgent, if you guys get some time can you please test out if the pip install builds everything on your local machines? I have tested it out on Apple Silicon Mac and Intel CPU with avx2. Seems to be working as expected for me.
Getting a similar error:
for ext in self.extensions:
^^^^^^^^^^^
@matthoffner @vootshiclone Thank you for raising the issue.
I have updated the setup script. Can you please try checking it now?
@PotatoSpudowski Successfully installed with sudo; importing doesn't work though.
Failed on win10:
PS C:\Users\dex> pip install git+https://github.com/PotatoSpudowski/fastLLaMa.git@feature/pip
Collecting git+https://github.com/PotatoSpudowski/fastLLaMa.git@feature/pip
Cloning https://github.com/PotatoSpudowski/fastLLaMa.git (to revision feature/pip) to c:\users\dex\appdata\local\temp\pip-req-build-gz4g0lcm
Running command git clone --filter=blob:none --quiet https://github.com/PotatoSpudowski/fastLLaMa.git 'C:\Users\dex\AppData\Local\Temp\pip-req-build-gz4g0lcm'
WARNING: Did not find branch or tag 'feature/pip', assuming revision or ref.
Running command git checkout -q feature/pip
error: pathspec 'feature/pip' did not match any file(s) known to git
error: subprocess-exited-with-error
× git checkout -q feature/pip did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× git checkout -q feature/pip did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
@Diyago can you try using WSL?
@vootshiclone sorry, I have been making a lot of changes on that branch. We merged it to main; can you test now?
PS: the import is now a little different too
from fastllama import Model
@PotatoSpudowski install worked only with this:
sudo pip install git+https://github.com/PotatoSpudowski/fastLLaMa
Import works👍
Trying to run the cell:
from fastllama import Model
MODEL_PATH = "./models/ggml-vic7b-q4_0.bin"
model = Model(
path=MODEL_PATH, #path to model
num_threads=14, #number of threads to use
n_ctx=512, #context size of model
last_n_size=64, #size of last n tokens (used for repetition penalty) (Optional)
seed=0, #seed for random number generator (Optional)
)
Here is the output:
PermissionError Traceback (most recent call last)
Cell In[1], line 1
----> 1 from fastllama import Model
2 MODEL_PATH = "./models/ggml-vic7b-q4_0.bin"
4 model = Model(
5 path=MODEL_PATH, #path to model
6 num_threads=14, #number of threads to use
(...)
9 seed=0, #seed for random number generator (Optional)
10 )
File /usr/local/lib/python3.10/dist-packages/fastllama/__init__.py:2
1 import os;
----> 2 from .api import *
4 set_library_path(os.path.dirname(os.path.abspath(__file__)))
File <frozen importlib._bootstrap>:1027, in _find_and_load(name, import_)
File <frozen importlib._bootstrap>:1006, in _find_and_load_unlocked(name, import_)
File <frozen importlib._bootstrap>:688, in _load_unlocked(spec)
File <frozen importlib._bootstrap_external>:879, in exec_module(self, module)
File <frozen importlib._bootstrap_external>:1016, in get_code(self, fullname)
File <frozen importlib._bootstrap_external>:1073, in get_data(self, path)
PermissionError: [Errno 13] Permission denied: '/usr/local/lib/python3.10/dist-packages/fastllama/api.py'
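For reference, that Errno 13 means the files written by sudo pip under dist-packages are not readable by a normal user account. A small stdlib check like the following (a diagnostic sketch, not part of fastLLaMa; `readable_by_others` is a hypothetical helper name) shows the difference, rehearsed on a throwaway file rather than the real install path:

```python
import os
import stat
import tempfile

def readable_by_others(path):
    """Return True if users other than the owner can read `path`."""
    return bool(os.stat(path).st_mode & stat.S_IROTH)

# Rehearse on a throwaway file instead of the real dist-packages path:
with tempfile.NamedTemporaryFile(delete=False) as f:
    probe = f.name
os.chmod(probe, 0o600)            # owner-only, like a root-owned install
print(readable_by_others(probe))  # False -> a non-root import would fail
os.chmod(probe, 0o644)            # world-readable, like a healthy install
print(readable_by_others(probe))  # True
os.remove(probe)
```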
You're installing the package as a superuser. Therefore, you have to run the program as a superuser. If you want to run the program normally, I would suggest installing it as a normal user.
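A sudo-free route is pip's user-level install (a generic pip feature, not anything project-specific), which writes to a directory the normal user owns, so the files stay readable at runtime:

```shell
# User-level install, no sudo needed:
#
#   pip install --user git+https://github.com/PotatoSpudowski/fastLLaMa
#
# The files then land in the per-user site-packages directory, which is
# owned and readable by your own account:
python3 -c "import site; print(site.getusersitepackages())"
```

Scripts installed this way go to the matching per-user bin directory, which may need to be on PATH.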
@amitsingh19975 I'm a noob.
How do I run it with superuser?
just add sudo before the command
sudo python example.py
@amitsingh19975 Yep, nevermind, works as expected with sudo 👍
Coolio, I guess we can close this issue in 2 days.
If anyone faces any issues, feel free to add them here, or reopen the issue if it has been closed.
@PotatoSpudowski Don't want to be a drag, but here is the dilemma:
In order to use it in Jupyter I need to run it NOT as a superuser (running cells as superuser in Jupyter is painful). I think a lot of people out there would prefer to use it that way. So to do that I have to stick with the old installation method (please put it back in the README for people like me) and use
from interfaces.python.fastllama import Model
as the way of importing it.
On the other hand, it would be cool to install it with pip without sudo as a normal user (and without the need to manually add it to PATH).
I'm assuming you're new to Unix-based OSes, but it's not an issue on our side. If you install Python as a normal user, you won't face this problem. Once you install something on a Unix-based system as superuser, the only way to run the executable is to run it as superuser.
There are a few ways to fix that if you're not keen on reinstalling Python:
- Change the permissions for your user using
chmod
- Put the session in superuser mode using
sudo su
if my memory serves me right.
However, I would suggest fixing your Python installation, because it'll save you time in the future.
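The chmod option above can be rehearsed on a throwaway directory first (a sketch; the real target would be the fastllama directory under dist-packages):

```shell
# Rehearse the permissions fix on a temp directory, not the real install:
demo=$(mktemp -d)
touch "$demo/api.py"
chmod 600 "$demo/api.py"   # simulate files readable only by root
chmod -R a+rX "$demo"      # a+r: everyone may read files;
                           # X: execute only on dirs (so they stay traversable)
ls -l "$demo/api.py"       # now shows -rw-r--r--
rm -r "$demo"
```

Capital `X` matters: lowercase `x` would also mark plain files executable, which you don't want here.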
@amitsingh19975 You are assuming right :) Sorry once again, figured it out👍 (just created another env in conda).
Installation works, but when I run the examples, I get:
ModuleNotFoundError: No module named 'fastllama.api'
Am I doing something wrong?