
Can't run example - access rights problem #416

Closed

evgenyigumnov opened this issue Aug 12, 2023 · 5 comments
@evgenyigumnov (Contributor)

Execute: cargo run --example llama

I get this error:

Running on CPU, to run on GPU, build this example with --features cuda
loading the model weights from meta-llama/Llama-2-7b-hf
Error: request error: https://huggingface.co/meta-llama/Llama-2-7b-hf/resolve/main/tokenizer.json: status code 401

I have an account (https://huggingface.co/ievnsk), created a token, saved it to the token file C:\Users\igumn\.cache\huggingface\token, and ran the example again. Now I get a new error:

loading the model weights from meta-llama/Llama-2-7b-hf
Error: I/O error A required privilege is not held by the client. (os error 1314)

Caused by:
A required privilege is not held by the client. (os error 1314)

Stack backtrace:
   0: backtrace::backtrace::dbghelp::trace
         at C:\Users\igumn\.cargo\registry\src\index.crates.io-6f17d22bba15001f\backtrace-0.3.68\src\backtrace\dbghelp.rs:98
   1: backtrace::backtrace::trace_unsynchronized<anyhow::backtrace::capture::impl$4::create::closure_env$0>
         at C:\Users\igumn\.cargo\registry\src\index.crates.io-6f17d22bba15001f\backtrace-0.3.68\src\backtrace\mod.rs:66
   2: backtrace::backtrace::trace<anyhow::backtrace::capture::impl$4::create::closure_env$0>
         at C:\Users\igumn\.cargo\registry\src\index.crates.io-6f17d22bba15001f\backtrace-0.3.68\src\backtrace\mod.rs:53
   3: anyhow::backtrace::capture::Backtrace::create
         at C:\Users\igumn\.cargo\registry\src\index.crates.io-6f17d22bba15001f\anyhow-1.0.72\src\backtrace.rs:216
   4: anyhow::backtrace::capture::Backtrace::capture
         at C:\Users\igumn\.cargo\registry\src\index.crates.io-6f17d22bba15001f\anyhow-1.0.72\src\backtrace.rs:204
   5: anyhow::error::impl$1::from<enum2$<hf_hub::api::sync::ApiError> >
         at C:\Users\igumn\.cargo\registry\src\index.crates.io-6f17d22bba15001f\anyhow-1.0.72\src\error.rs:547
   6: core::result::impl$27::from_residual<tuple$<>,enum2$<hf_hub::api::sync::ApiError>,anyhow::Error>
         at /rustc/eb26296b556cef10fb713a38f3d16b9886080f26\library\core\src\result.rs:1961
   7: llama::main
         at .\candle-examples\examples\llama\main.rs:168
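
In case it helps, here is a minimal sketch of passing the token to hf-hub explicitly instead of relying on the token file being picked up (this assumes the hf-hub sync API that the candle examples use; the token string and the standalone main are placeholders for illustration):

// Sketch only: supply the Hugging Face access token programmatically via hf-hub.
// Assumes the hf-hub crate with its sync API; replace the token with your own.
use hf_hub::api::sync::ApiBuilder;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api = ApiBuilder::new()
        .with_token(Some("hf_your_token_here".to_string()))
        .build()?;

    // Fetch the same file the example fails on; on success this prints
    // the path of the file in the local cache.
    let repo = api.model("meta-llama/Llama-2-7b-hf".to_string());
    let tokenizer_path = repo.get("tokenizer.json")?;
    println!("tokenizer cached at {}", tokenizer_path.display());
    Ok(())
}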

@Narsil (Collaborator) commented Aug 12, 2023

Did you get authorized on https://huggingface.co/meta-llama/Llama-2-7b-hf? That model is gated; you need to accept the license and be approved before you can use it.
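
A quick way to check whether a token actually passes the gate is to request the failing file directly with an Authorization header. A minimal sketch, assuming the ureq crate (the HTTP client hf-hub uses) is added as a dependency, with a placeholder token:

// Sketch only: probe the gated file with a token and report the HTTP status.
fn main() {
    let url = "https://huggingface.co/meta-llama/Llama-2-7b-hf/resolve/main/tokenizer.json";
    match ureq::get(url)
        .set("Authorization", "Bearer hf_your_token_here")
        .call()
    {
        // 200: the token is valid and the license grant is active.
        Ok(resp) => println!("ok, status {}", resp.status()),
        // ureq surfaces 401/403 (bad token or gate not granted) as a status error.
        Err(ureq::Error::Status(code, _resp)) => println!("rejected, status {}", code),
        Err(e) => println!("transport error: {}", e),
    }
}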

Narsil closed this as completed Aug 12, 2023
@evgenyigumnov (Contributor, Author)

I think I am authorized, because when I open the https://huggingface.co/meta-llama/Llama-2-7b-hf/resolve/main/tokenizer.json link in my browser it starts downloading.

@evgenyigumnov (Contributor, Author)

On the page https://huggingface.co/meta-llama/Llama-2-7b-hf
I see the message: "Gated model: You have been granted access to this model"

@evgenyigumnov (Contributor, Author)

I could generate a temporary token for you if you need it.

@evgenyigumnov (Contributor, Author)

Here is my read-only temporary token for testing: 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
