Comments (5)
Hi, can you cd into "build" and run "cmake .." and then "make"? Please share the logs and we can help debug!
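(If the folder doesn't exist yet, the usual out-of-source CMake sequence is below; this is a sketch assuming a Unix-like shell where make is available:)
mkdir build
cd build
cmake ..
make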
from fastllama.
Hey! I recloned the repo and there is no build folder. I ran setup.py, which created the build folder. Here is everything that happened in the terminal:
PS C:\Users\SumYin\Desktop\llama\fastLLaMa> cd build
cd : Cannot find path 'C:\Users\SumYin\Desktop\llama\fastLLaMa\build' because it does not exist.
At line:1 char:1
+ cd build
+ ~~~~~~~~
+ CategoryInfo : ObjectNotFound: (C:\Users\SumYin...fastLLaMa\build:String) [Set-Location], ItemNotFoundException
+ FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.SetLocationCommand
PS C:\Users\SumYin\Desktop\llama\fastLLaMa> python setup.py
An error occurred while running 'make': Command '['cd ./build && cmake .. && make']' returned non-zero exit status 1.
Output:
PS C:\Users\SumYin\Desktop\llama\fastLLaMa> cd build
PS C:\Users\SumYin\Desktop\llama\fastLLaMa\build> cmake ..
-- Building for: NMake Makefiles
CMake Error at CMakeLists.txt:2 (project):
Running
'nmake' '-?'
failed with:
The system cannot find the file specified
CMake Error: CMAKE_C_COMPILER not set, after EnableLanguage
CMake Error: CMAKE_CXX_COMPILER not set, after EnableLanguage
-- Configuring incomplete, errors occurred!
See also "C:/Users/SumYin/Desktop/llama/fastLLaMa/build/CMakeFiles/CMakeOutput.log".
PS C:\Users\SumYin\Desktop\llama\fastLLaMa\build>
I am not sure what you mean by logs; are they stored somewhere?
from fastllama.
- Check that you have Visual Studio with the "Desktop development with C++" workload; if not, you can download it from https://visualstudio.microsoft.com/downloads
- Can you try using the Developer Command Prompt?
- Check that the Windows environment variables for C/C++ are set properly.
I have yet to test this on Windows. If you get it working, let us know; otherwise please share the errors (one possible workaround is sketched below).
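If CMake still cannot find nmake, a possible workaround is to request the Visual Studio generator explicitly so CMake drives MSBuild instead of NMake. A sketch, assuming Visual Studio 2022 is installed (adjust the generator name to your version):
cmake .. -G "Visual Studio 17 2022"
cmake --build . --config Release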
from fastllama.
I'm having the same error. I'm sure 1 and 3 are not the issue, since I have alpaca and built it with cmake; I tried 2, no dice.
When I load it in Visual Studio and try to build with cmake through there, I get:
Severity Code Description Project File Line Suppression State
Warning D9002 ignoring unknown option '-O3' C:\AI\fastLLaMa\out\build\x64-Debug\fastLLaMa C:\AI\fastLLaMa\out\build\x64-Debug\cl 1
Error C7555 use of designated initializers requires at least '/std:c++20' C:\AI\fastLLaMa\out\build\x64-Debug\fastLLaMa C:\AI\fastLLaMa\bridge.cpp 237
Error C7555 use of designated initializers requires at least '/std:c++20' C:\AI\fastLLaMa\out\build\x64-Debug\fastLLaMa C:\AI\fastLLaMa\bridge.cpp 574
Error C7555 use of designated initializers requires at least '/std:c++20' C:\AI\fastLLaMa\out\build\x64-Debug\fastLLaMa C:\AI\fastLLaMa\bridge.cpp 579
Error C2039 'accumulate': is not a member of 'std' C:\AI\fastLLaMa\out\build\x64-Debug\fastLLaMa C:\AI\fastLLaMa\bridge.cpp 967
Error C3861 'accumulate': identifier not found C:\AI\fastLLaMa\out\build\x64-Debug\fastLLaMa C:\AI\fastLLaMa\bridge.cpp 967
Error C2737 'stop_token_len': const object must be initialized C:\AI\fastLLaMa\out\build\x64-Debug\fastLLaMa C:\AI\fastLLaMa\bridge.cpp 967
Error C3536 'stop_token_len': cannot be used before it is initialized C:\AI\fastLLaMa\out\build\x64-Debug\fastLLaMa C:\AI\fastLLaMa\bridge.cpp 971
from fastllama.
Can you try building the refactor branch? The errors you're getting are indeed C++ errors, not MSVC-specific ones.
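For what it's worth, those diagnostics point at two concrete things: MSVC needs the C++20 language level enabled for designated initializers (the C7555 errors), and std::accumulate lives in <numeric>, which bridge.cpp may not be including (the C2039/C3861 errors). A sketch of forcing the standard at configure time, assuming the project's CMakeLists.txt does not already pin one:
cmake .. -DCMAKE_CXX_STANDARD=20 -DCMAKE_CXX_STANDARD_REQUIRED=ON
cmake --build .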
from fastllama.