Comments (4)
Hi @greyblue9 ,
Are you sure you're correctly passing the token?
We recently changed the rules to be less silent about incorrect tokens, especially when the formatting was forgotten.
We used to simply pass the request through as unauthenticated, but that caused issues: some users didn't realize they had made a simple typo and couldn't access their plan features.
f"Bearer {API_TOKEN}"  # works
# vs
"Bearer {API_TOKEN}"  # doesn't work: without the f-prefix the literal text "{API_TOKEN}" is sent, which is an invalid token.
Both old and new tokens should work the same way by the way.
Sorry about this issue.
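To make the difference concrete, here is a minimal sketch in Python (the token value is a placeholder, not a real credential):

```python
API_TOKEN = "hf_xxx"  # placeholder token, not a real credential

# With the f-prefix, the token is interpolated into the header value.
good = f"Bearer {API_TOKEN}"

# Without the f-prefix, the literal text "{API_TOKEN}" is sent instead of
# the token itself, which the API now rejects as an invalid token.
bad = "Bearer {API_TOKEN}"

print(good)  # Bearer hf_xxx
print(bad)   # Bearer {API_TOKEN}
```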
from api-inference-community.
Hi @greyblue9 ,
Both old and new tokens should work the same way by the way.
Thank you so much for your response. I actually created a new account with a new (write) token, and that worked 🤗
The token I was using previously was much longer (about 3x) and had no api_ or hf_ prefix, and I was passing it as f"Bearer: api_{token}".
Maybe the : was the issue?
Probably; the header really needs to be exactly Bearer {API_TOKEN}!
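For reference, a minimal sketch of building the header correctly with the standard library (the model URL is just an example; no request is actually sent here):

```python
from urllib.request import Request

API_TOKEN = "hf_xxx"  # placeholder; use the token from your account settings

# The scheme is the word "Bearer", a space, then the token -- no colon.
req = Request(
    "https://api-inference.huggingface.co/models/gpt2",  # example model URL
    headers={"Authorization": f"Bearer {API_TOKEN}"},
)
print(req.get_header("Authorization"))  # Bearer hf_xxx
```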
Glad you could work it out.
I will close this; don't hesitate to reopen if you think that's incorrect.