Comments (3)
There is already a PR for this, #2, which will be merged today.
The PR adds the model as an input parameter for predict_fn.
from sagemaker-huggingface-inference-toolkit.
Ok, perfect. Yes, indeed: with the current one-argument predict_fn(processed_data), inference fails with `W-model-1-stdout com.amazonaws.ml.mms.wlm.WorkerLifeCycle - mms.service.PredictionException: name 'model' is not defined : 400`.
I'm using the functions below:

```python
import os

import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForQuestionAnswering


def load_fn(model_dir):
    """Read the model and tokenizer from disk."""
    print('load_fn dir view:')
    print(os.listdir())
    # load model
    model = TFAutoModelForQuestionAnswering.from_pretrained('/opt/ml/model')
    # load tokenizer
    tokenizer = AutoTokenizer.from_pretrained('/opt/ml/model')
    return model, tokenizer


def predict_fn(processed_data):
    """Run inference. Note: `model` and `tokenizer` are only assigned inside
    load_fn, so the lookups below are what raise the NameError."""
    print('processed_data received:')
    print(processed_data)
    print('model name:')
    print(model.name)
    print('tok name:')
    print(tokenizer.name)
    question, text = processed_data['inputs']['question'], processed_data['inputs']['context']
    input_dict = tokenizer(question, text, return_tensors='tf')
    outputs = model(input_dict)
    start_logits = outputs.start_logits
    end_logits = outputs.end_logits
    all_tokens = tokenizer.convert_ids_to_tokens(input_dict['input_ids'].numpy()[0])
    answer = ' '.join(all_tokens[tf.math.argmax(start_logits, 1)[0] : tf.math.argmax(end_logits, 1)[0] + 1])
    return answer
```
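The failure above is ordinary Python scoping: `model` is a local variable inside load_fn, so a one-argument predict_fn that references it raises NameError. A minimal, standalone reproduction (no SageMaker or transformers needed; the `object()` stand-ins are just placeholders):

```python
def load_fn(model_dir):
    model = object()      # stand-in for TFAutoModelForQuestionAnswering
    tokenizer = object()  # stand-in for AutoTokenizer
    return model, tokenizer

def predict_fn(processed_data):
    # `model` was assigned only inside load_fn, never at module level,
    # so this lookup fails exactly as the MMS worker log shows
    return model.name

load_fn('/opt/ml/model')
try:
    predict_fn({'inputs': {}})
except NameError as err:
    print(err)  # name 'model' is not defined
```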
With the new predict_fn(processed_data, model) signature, things work fine, as documented in #10 (comment).
from sagemaker-huggingface-inference-toolkit.
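A minimal sketch of how the two-argument signature resolves the NameError, assuming the toolkit passes whatever load_fn returned as the second argument to predict_fn. The StubModel/StubTokenizer classes are hypothetical stand-ins so the flow can be exercised without TensorFlow or model weights:

```python
class StubTokenizer:
    """Hypothetical stand-in for AutoTokenizer."""
    def __call__(self, question, context):
        return {'question': question, 'context': context}

class StubModel:
    """Hypothetical stand-in for TFAutoModelForQuestionAnswering."""
    def __call__(self, input_dict):
        return 'answer for: ' + input_dict['question']

def load_fn(model_dir):
    # in the real handler both objects come from from_pretrained('/opt/ml/model')
    return StubModel(), StubTokenizer()

def predict_fn(processed_data, model_and_tokenizer):
    # the toolkit now hands over the object load_fn returned,
    # so no module-level `model` lookup is needed
    model, tokenizer = model_and_tokenizer
    question = processed_data['inputs']['question']
    context = processed_data['inputs']['context']
    return model(tokenizer(question, context))

artifacts = load_fn('/opt/ml/model')
print(predict_fn({'inputs': {'question': 'Who?', 'context': 'Bob did it.'}}, artifacts))
# -> answer for: Who?
```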
Related Issues (20)
- Using custom inference script and models from Hub HOT 1
- get_pipeline function passes Path object rather than PretrainedTokenizer
- No support for multi-GPU HOT 3
- 🏷️ invalid
- Sagemaker endpoint inferencing error with HF model loading from s3bucket with new transformer update HOT 5
- Support multiple return sequences
- HF_TASK Environment Variable error HOT 1
- Endpoint creation completes before custom model_fn finishes loading resources
- ARCHITECTURES_2_TASK is limiting the tasks able to be deployed with HF DLC HOT 11
- Make DEFAULT_HF_HUB_MODEL_EXPORT_DIRECTORY configurable
- InternalServerException at runtime HOT 3
- trust_remote_code=True in new Hugging Face LLM Inference Container for Amazon SageMaker HOT 2
- How to access CustomAttributes in async inference request input_fn HOT 1
- [DOCS] List of available HF_TASK and default inference scripts HOT 4
- Dead Link for Available HF_Tasks HOT 1
- SageMaker deployment errors HOT 2
- Error on Sagemaker deployment for v1.0.1 HOT 1
- How can I deploy a model with AWS S3 and without downloading model from huggingface via TGI image on Sagemaker? HOT 2
- How to enable Batch inference on AWS deployed Serverless model from Hub? HOT 1
- Where is the logic for detecting custom inference.py? HOT 6