Comments (2)
Fixed in 73715fe. Released as v0.5.2.
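For anyone pinned to an older release, bumping the dependency should pick up the fix. A minimal mix.exs sketch, assuming the standard Bumblebee + EXLA setup (the exact version constraint is taken from the release note above):

```elixir
# mix.exs — require the release that contains the fix
defp deps do
  [
    {:bumblebee, "~> 0.5.2"},
    {:exla, ">= 0.0.0"}
  ]
end
```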
Ran into a similar issue loading a fine-tuned Mistral model:
{:ok, model_info} = Bumblebee.load_model({:hf, "NousResearch/Nous-Hermes-2-Mistral-7B-DPO"}, backend: {EXLA.Backend, client: :host})
Error:
** (FunctionClauseError) no function clause matching in Unpickler.load_op/3
The following arguments were given to Unpickler.load_op/3:
# 1
240
# 2
<<39, 0, 0, 0, 0, 0, 0, 123, 34, 95, 95, 109, 101, 116, 97, 100, 97, 116, 97, 95, 95, 34, 58, 123,
34, 102, 111, 114, 109, 97, 116, 34, 58, 34, 112, 116, 34, 125, 44, 34, 108, 109, 95, 104, 101,
97, 100, 46, 119, 101, ...>>
# 3
%{stack: [], persistent_id_resolver: nil, object_resolver: nil, refs: %{}, memo: %{}, metastack: []}
...
(unpickler 0.1.0) lib/unpickler.ex:240: Unpickler.load_op/3
(bumblebee 0.5.1) lib/bumblebee/conversion/pytorch/loader.ex:192: Bumblebee.Conversion.PyTorch.Loader.load_legacy!/1
(bumblebee 0.5.1) lib/bumblebee/conversion/pytorch.ex:48: anonymous fn/2 in Bumblebee.Conversion.PyTorch.load_params!/4
(elixir 1.15.6) lib/enum.ex:1693: Enum."-map/2-lists^map/1-1-"/2
(bumblebee 0.5.1) lib/bumblebee/conversion/pytorch.ex:47: anonymous fn/4 in Bumblebee.Conversion.PyTorch.load_params!/4
(nx 0.7.0) lib/nx.ex:4447: Nx.with_default_backend/2
(bumblebee 0.5.1) lib/bumblebee.ex:607: Bumblebee.load_params/5
However, I did manage to load a prior version of this model: teknium/OpenHermes-2-Mistral-7B
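A side note on the dump above: decoding the leading bytes from the FunctionClauseError as ASCII suggests the file handed to the unpickler was not a pickle stream at all. This is only an observation about the bytes printed in the trace, not a claim about what 73715fe changed:

```elixir
# Bytes copied verbatim from argument #2 in the stack trace above
# (truncated where the error output was truncated).
prefix =
  <<39, 0, 0, 0, 0, 0, 0, 123, 34, 95, 95, 109, 101, 116, 97, 100, 97, 116,
    97, 95, 95, 34, 58, 123, 34, 102, 111, 114, 109, 97, 116, 34, 58, 34,
    112, 116, 34, 125, 44, 34, 108, 109, 95, 104, 101, 97, 100, 46, 119, 101>>

# The printable part starts at offset 7 in this dump; everything after it
# is JSON, which is what a safetensors header looks like, not pickle opcodes.
<<_leading::binary-size(7), json::binary>> = prefix
IO.puts(json)
# {"__metadata__":{"format":"pt"},"lm_head.we
```

That `{"__metadata__":{"format":"pt"},...}` shape followed by a tensor name (`lm_head.we…`) matches the header layout of a safetensors file, so the checkpoint in that repo may have been safetensors data behind a `.bin` name.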