Cannot get access with token

#9
by ShimJL - opened

I am already successfully logged in using the command huggingface-cli login, but when I run the official script, it shows 401 Client Error: Unauthorized for url: https://huggingface.co/tiiuae/falcon-180B/resolve/main/tokenizer_config.json. Is the model currently unavailable, or is something not working?

Here are the details:
You are trying to access a gated repo.
Make sure to request access at https://huggingface.co/tiiuae/falcon-180b and pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>.
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/tiiuae/falcon-180B/resolve/main/tokenizer_config.json

The above exception was the direct cause of the following exception:

huggingface_hub.utils._errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-64f9aa93-414a0d3c0edd540a1046c05f;e1b40200-a0eb-41af-ae76-9eb3254416f8)

Cannot access gated repo for url https://huggingface.co/tiiuae/falcon-180B/resolve/main/tokenizer_config.json.
Repo model tiiuae/falcon-180B is gated. You must be authenticated to access it.

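For reference, the token route mentioned in the error message would look roughly like this, assuming access to the repo has already been granted and a recent transformers version that accepts token= (hf_xxx is a placeholder for your own token):

from transformers import AutoTokenizer

# "hf_xxx" is a placeholder; use a token from the same account that was granted access.
access_token = "hf_xxx"
tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-180B", token=access_token)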

Same error for me; it does not work with the alternatives suggested here:
https://huggingface.co/docs/hub/security-tokens

I tried:

from transformers import AutoModel

# Pass the access token explicitly to from_pretrained.
access_token = "hf_..."
model = AutoModel.from_pretrained("private/model", token=access_token)

and logging in via huggingface-cli login or huggingface-cli login --token hf_....

OSError: tiiuae/falcon-180b is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token or log in with huggingface-cli login and pass use_auth_token=True.

I used my own Hugging Face token, but the issue still persists. I am on Azure Databricks.

Please help me!
I got the same error. I've tried everything, but it still does not work:
OSError: You are trying to access a gated repo.
Make sure to request access at https://huggingface.co/tiiuae/falcon-180b-chat and pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>.

@ShimJL @necro666 @phdykd @Amadou95 I'm not from the TII team, but did you request access to the repo with the same account as the access token? Is the token you passed a "read" or "write" token?
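One quick way to double-check which account a token actually belongs to is huggingface-cli whoami, or its Python equivalent (a small sketch; hf_xxx is a placeholder for the token you are passing):

from huggingface_hub import HfApi

# Prints the account details that the token resolves to.
print(HfApi().whoami(token="hf_xxx"))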

For me it's a write token, and yes, it's the same account as this one; I only have one Hugging Face account.

Did you agree to the repo's access conditions with your account?

Yes! I got this on the Falcon page (https://huggingface.co/tiiuae/falcon-180B-chat): "Gated model: You have been granted access to this model".

That's also the same message on https://huggingface.co/tiiuae/falcon-180B.

That's super weird. Can you try restarting your terminal / Python IDE / whatever you are using to run the Python script after running huggingface-cli login, and then running the script again?

I just tried what you said. For the login I got:
Token can be pasted using 'Right-Click'.
Token:
Add token as git credential? (Y/n) Y
Token is valid (permission: write).
Your token has been saved in your configured git credential helpers (manager-core).
Your token has been saved to C:\Users\Papes\.cache\huggingface\token
Login successful

After that I closed and reopened the terminal, but I still have no access, same message:
OSError: You are trying to access a gated repo.
Make sure to request access at https://huggingface.co/tiiuae/falcon-180b-chat and pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>.

Can you try adding

from huggingface_hub import login
login()

before accessing the repo in the Python script?
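login() also accepts the token directly if you want to skip the interactive prompt; a minimal sketch, with hf_xxx as a placeholder:

from huggingface_hub import login

# Non-interactive variant; "hf_xxx" is a placeholder for your own token.
login(token="hf_xxx")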

I just tried it. Here is the start of my code:

from transformers import AutoTokenizer
import transformers
import torch
from huggingface_hub import login
login()

model = "tiiuae/falcon-180b-chat"

tokenizer = AutoTokenizer.from_pretrained(model)

After that, my terminal asked me for my Hugging Face token and I got "Login successful",
but I still have the same error.

Any update on resolving the access issues?

Not yet. I've tried creating a new Hugging Face account, accepting the conditions to use Falcon, and creating 2 new tokens (read and write), but I have the same error message. I've tried with both tokens and still got nothing.

I'm still blocked!

Getting the same error in VS Code. I tried huggingface-cli login --token MyHFToken. The login was successful, but after running the sample code I get the error below.

OSError: You are trying to access a gated repo.
Make sure to request access at https://huggingface.co/tiiuae/falcon-180b and pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>.

Yup, I'm seeing the same issue.

SOLVED: you need to get the capitalization of the model name correct: "tiiuae/falcon-180B", not "tiiuae/falcon-180b"
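For anyone following along, a minimal sketch with the corrected repo id (this assumes access has already been granted, a recent transformers version that accepts token=, and hf_xxx as a placeholder token):

from transformers import AutoTokenizer

# Note the capital "B": "tiiuae/falcon-180B", and likewise "tiiuae/falcon-180B-chat".
tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-180B-chat", token="hf_xxx")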

@YokaiKoibito THANKS A LOT !!!!!!
That worked for me!

This worked like a charm !!! Thank you.

ShimJL changed discussion status to closed

HTTPError: Invalid user token. If you didn't pass a user token, make sure you are properly logged in by executing huggingface-cli login, and if you did pass a user token, double-check it's correct.
I get this for AutoModelForTokenClassification.from_pretrained("distilbert/distilbert-base-uncased", num_labels=len(all_labels), label2id=label2id, id2label=id2label, use_auth_token=True)
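As far as I know, distilbert/distilbert-base-uncased is not a gated repo, so "Invalid user token" usually points at the stored or passed token itself rather than access permissions. A small sketch that passes the token explicitly instead of use_auth_token=True (recent transformers versions prefer token=); hf_xxx and the label setup below are placeholders:

from transformers import AutoModelForTokenClassification

# Placeholder label setup; replace with your own labels.
all_labels = ["O", "B-ENT", "I-ENT"]
label2id = {label: i for i, label in enumerate(all_labels)}
id2label = {i: label for label, i in label2id.items()}

# "hf_xxx" is a placeholder token. token= replaces the older use_auth_token=True.
model = AutoModelForTokenClassification.from_pretrained(
    "distilbert/distilbert-base-uncased",
    num_labels=len(all_labels),
    label2id=label2id,
    id2label=id2label,
    token="hf_xxx",
)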

Is Google Colab enough for running this model?
