
[Community Event] Docstring Sprint #26638

Closed
ydshieh opened this issue Oct 6, 2023 · 42 comments · Fixed by #26661, #26664, #26666, #26669 or #26674

Comments

@ydshieh
Collaborator

ydshieh commented Oct 6, 2023

Docstrings are important for understanding what inputs a function/method expects and what output format it returns!

This issue is part of the HACKTOBERFEST event 🔥 . It is a call for contributions with the goal of helping Transformers have the required and correct docstrings, so users can work with it more smoothly.

Adding or fixing a docstring is a simple (possibly first) contribution to Transformers, and most importantly a very valuable contribution to the Transformers community ❤️ .

If you're interested in making a (maybe first!) contribution, please read through the Guide to contributing below. Before starting work, please reply in this thread with which file you'd like to take :)

An example of such a PR.

Guide to contributing:

  1. Ensure you've read our contributing guidelines 📜
  2. Make sure you have installed the development dependencies with pip install -e ".[dev]", as described in the contributor guidelines above, so that the code quality tools in make fixup can run.
  3. Look at the file utils/check_docstrings.py:
    • find the line OBJECTS_TO_IGNORE =
    • choose a name in the list OBJECTS_TO_IGNORE (make sure it has not already been taken by someone in the comments of this issue)
    • claim the entry in this thread (confirm no one is working on it) 🎯
    • select a single entry per PR, or at most the entries from the same model architecture (its config/tokenizer/model/processor objects)
  4. Remove the selected item (from step 3.) from OBJECTS_TO_IGNORE
    • commit the changes
  5. Run python3 utils/check_docstrings.py --fix_and_overwrite
    • you might see something like:
      • (screenshot of the tool's output, omitted)
    • commit the changes
  6. Fill in the information where <fill_type> or <fill_docstring> appears:
    • you can usually find the content to fill in from other files (by searching the codebase)

    • compared to step 5.), the output now looks like:

      • (screenshot of the filled docstring, omitted)
    • commit the changes

  7. Run python3 utils/check_docstrings.py
    • make sure nothing changes; otherwise further work is required.
  8. Run make fixup
  9. Open the PR
    • with a title in the format [docstring] ... (entry name you work on) ...
    • wait for CI to be green
    • otherwise, try to fix the failing tests if possible 🙏 . If necessary, ping me for help on this.
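For step 6, the layout to aim for is the standard argument-documentation style used across the library. A minimal sketch, where the function and its `timeout` parameter are hypothetical and exist only to illustrate the format:

```python
# Minimal sketch of the docstring layout that utils/check_docstrings.py validates.
# The function and its `timeout` argument are hypothetical examples.
def wait_for_result(timeout: float = 10.0) -> bool:
    """
    Illustrates the `Args:` / `Returns:` layout used throughout Transformers.

    Args:
        timeout (`float`, *optional*, defaults to 10.0):
            Maximum number of seconds to wait for the result.

    Returns:
        `bool`: Whether the result arrived within `timeout` seconds.
    """
    return timeout >= 0.0
```

Roughly speaking, a `<fill_type>` placeholder corresponds to the backticked type (here `float`), and `<fill_docstring>` to the indented description line below it.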

Looking forward to your contributions 🔥 ❤️ 🚀 !

@joelsathi

Hi @ydshieh! I would like to work on this issue. Can you please assign me? From the list of OBJECTS_TO_IGNORE, I would like to work on "AlbertModel".

@AdwaitSalankar
Contributor

AdwaitSalankar commented Oct 6, 2023

Hi! I would like to work on 'BertGenerationConfig'.

@pksX01

pksX01 commented Oct 6, 2023

Hi @ydshieh!
I would like to work on 'BartConfig'.

@neet-14

neet-14 commented Oct 6, 2023

Hi @ydshieh, I would like to work on this issue. Could you assign me 'BertModel'?

@abzdel
Contributor

abzdel commented Oct 6, 2023

I'll take DonutImageProcessor!

@McDonnellJoseph
Contributor

McDonnellJoseph commented Oct 6, 2023

Hi @ydshieh, I can take care of GPT2Config, GPT2Tokenizer, GPT2TokenizerFast, WhisperTokenizerFast and WhisperTokenizer.

@Sparty
Contributor

Sparty commented Oct 6, 2023

ZeroShotObjectDetectionPipeline

@Ayush-Balodi

Is this issue open or closed?

@AVAniketh0905
Contributor

AVAniketh0905 commented Oct 7, 2023

I would like to work on DPRConfig.

@shivanandmn
Contributor

I will be working on SwinModel.

@KMJ-007

KMJ-007 commented Oct 7, 2023

I would like to work on BartTokenizerFast, BarthezTokenizerFast, BertTokenizerFast, AlbertTokenizerFast, BigBirdTokenizerFast, BlenderbotSmallTokenizerFast, BlenderbotTokenizerFast.

@pavaris-pm
Contributor

I will be working on LlamaConfig.

@gizemt
Contributor

gizemt commented Oct 7, 2023

I'll work on UniSpeechConfig, UniSpeechForCTC, UniSpeechSatConfig, UniSpeechSatForCTC.

Edit: Also working on Wav2Vec2ForCTC.

@mkobbi

mkobbi commented Oct 7, 2023

Hi @ydshieh, I'd like to work on FlaxGPTNeoForCausalLM, FlaxGPTNeoModel, GPTNeoXConfig, and GPTNeoXTokenizerFast.

@KyleGrande

Interested in working on AzureOpenAiAgent and Blip2VisionConfig. Just those for now, maybe more further down the line.

@Bojun-Feng
Contributor

Bojun-Feng commented Oct 8, 2023

I would like to work on CodeLlamaTokenizer and CodeLlamaTokenizerFast

Update Oct 13: Also working on RwkvConfig

@pksX01

pksX01 commented Oct 8, 2023

I am trying to install the dependencies using the python3 -m pip install -e ".[dev]" command in a separate conda environment on a MacBook Air M1, but I am getting different errors at different times. For example:

ERROR: Could not find a version that satisfies the requirement tensorflow-text<2.15; extra == "dev" (from transformers[dev]) (from versions: none)
ERROR: No matching distribution found for tensorflow-text<2.15; extra == "dev"

ERROR: Could not find a version that satisfies the requirement decord==0.6.0; extra == "dev" (from transformers[dev]) (from versions: none)
ERROR: No matching distribution found for decord==0.6.0; extra == "dev"

@ydshieh Could you please help me here?

@AVAniketh0905
Contributor

I am trying to install dependencies using python3 -m pip install -e ".[dev]" command in a separate conda environment on macbook m1 air but I am getting different errors different times. For example:...

I am facing a similar problem; I have referenced it here: #26656

@Bojun-Feng
Contributor

Bojun-Feng commented Oct 8, 2023

I am trying to install dependencies using python3 -m pip install -e ".[dev]" command in a separate conda environment on macbook m1 air but I am getting different errors different times. For example:...

I am also facing a similar problem, also on a Mac with an M1 chip. Referenced at #26666.

@pksX01

pksX01 commented Oct 14, 2023

When I ran the python3 -m pip install -e ".[dev-torch]" --user command, I got the error below:

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
analytics-python 1.4.0 requires backoff==1.10.0, but you have backoff 1.11.1 which is incompatible.
fastai 2.7.12 requires torch<2.1,>=1.7, but you have torch 2.1.0 which is incompatible.
wandb 0.13.3 requires protobuf<4.0dev,>=3.12.0, but you have protobuf 4.24.4 which is incompatible.

After this, I ran make fixup and got the error below:

No library .py files were modified
python utils/custom_init_isort.py
python utils/sort_auto_mappings.py
doc-builder style src/transformers docs/source --max_len 119 --path_to_docs docs/source
make: doc-builder: No such file or directory
make: *** [extra_style_checks] Error 1

Is the second error caused by the first one?

@ydshieh, @NielsRogge Could you please help me here? Sorry for asking so many questions. I am new to the transformers library and to open source in general.

@daniilgaltsev
Contributor

Hi! I will work on the CodeGen entries: CodeGenConfig, CodeGenTokenizer, CodeGenTokenizerFast.

@ydshieh
Collaborator Author

ydshieh commented Oct 16, 2023

When I ran python3 -m pip install -e ".[dev-torch]" --user command, I got below error:...

Maybe try a fresh Python environment, preferably created with Miniconda and with Python version 3.8.x.
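Before reinstalling the dev dependencies, a quick sanity check that the interpreter in the new environment matches the suggestion above can save a round trip; a trivial sketch (the `(3, 8)` default just mirrors the suggested version):

```python
import sys

def meets_version(required=(3, 8), version_info=sys.version_info):
    """Return True when the running interpreter is at least `required`."""
    return tuple(version_info[:2]) >= required

print(meets_version())
```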

@przemL
Contributor

przemL commented Oct 16, 2023

I'll work on BertJapaneseTokenizer and BitImageProcessor.

@ydshieh ydshieh reopened this Oct 16, 2023
helboukkouri pushed a commit to helboukkouri/transformers that referenced this issue Oct 16, 2023
* [docstring] Remove 'BertGenerationConfig' from OBJECTS_TO_IGNORE

* [docstring] Fix docstring for 'BertGenerationConfig' (huggingface#26638)
@louietouie
Contributor

I will be working on LukeConfig, thanks! #26858

@HaoES

HaoES commented Oct 17, 2023

I will be working on BertConfig

@HaoES

HaoES commented Oct 17, 2023

@ydshieh can you please help:

  • I forked the repo
  • cloned it to my laptop
  • installed with pip install -e ".[dev]"
  • removed the line "BertConfig", from the specified file

Then, when I ran python3 utils/check_docstrings.py --fix_and_overwrite, I got the following huge error:
2023-10-17 16:05:04.226199: E tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:9342] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2023-10-17 16:05:04.226417: E tensorflow/compiler/xla/stream_executor/cuda/cuda_fft.cc:609] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2023-10-17 16:05:04.227616: E tensorflow/compiler/xla/stream_executor/cuda/cuda_blas.cc:1518] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2023-10-17 16:05:05.517032: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Fixing the docstring of BertConfig in /home/kiriyama/Projects/transformers/src/transformers/models/bert/configuration_bert.py.
Could not load the custom kernel for multi-scale deformable attention: Error building extension 'MultiScaleDeformableAttention': [1/4] c++ -MMD -MF ms_deform_attn_cpu.o.d -DTORCH_EXTENSION_NAME=MultiScaleDeformableAttention -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE=\"_gcc\" -DPYBIND11_STDLIB=\"_libstdcpp\" -DPYBIND11_BUILD_ABI=\"_cxxabi1011\" -I/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/api/include -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/TH -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/THC -isystem /home/kiriyama/mambaforge/include -isystem /home/kiriyama/mambaforge/include/python3.10 -D_GLIBCXX_USE_CXX11_ABI=0 -fPIC -std=c++17 -DWITH_CUDA=1 -c /home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cpu/ms_deform_attn_cpu.cpp -o ms_deform_attn_cpu.o
[2/4] c++ -MMD -MF vision.o.d -DTORCH_EXTENSION_NAME=MultiScaleDeformableAttention -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE=\"_gcc\" -DPYBIND11_STDLIB=\"_libstdcpp\" -DPYBIND11_BUILD_ABI=\"_cxxabi1011\" -I/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/api/include -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/TH -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/THC -isystem /home/kiriyama/mambaforge/include -isystem /home/kiriyama/mambaforge/include/python3.10 -D_GLIBCXX_USE_CXX11_ABI=0 -fPIC -std=c++17 -DWITH_CUDA=1 -c /home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/vision.cpp -o vision.o
In file included from /home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/vision.cpp:11:
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/ms_deform_attn.h: In function ‘at::Tensor ms_deform_attn_forward(const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, int)’:
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/ms_deform_attn.h:29:20: warning: ‘at::DeprecatedTypeProperties& at::Tensor::type() const’ is deprecated: Tensor.type() is deprecated. Instead use Tensor.options(), which in many cases (e.g. in a constructor) is a drop-in replacement. If you were using data from type(), that is now available from Tensor itself, so instead of tensor.type().scalar_type(), use tensor.scalar_type() instead and instead of tensor.type().backend() use tensor.device(). [-Wdeprecated-declarations]
   29 |     if (value.type().is_cuda())
      |                    ^
In file included from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/Tensor.h:3,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/Tensor.h:3,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/autograd/function_hook.h:3,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/autograd/cpp_hook.h:2,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/autograd/variable.h:6,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/autograd/autograd.h:3,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/api/include/torch/autograd.h:3,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/api/include/torch/all.h:7,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/extension.h:4,
                 from /home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cpu/ms_deform_attn_cpu.h:12,
                 from /home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/ms_deform_attn.h:13,
                 from /home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/vision.cpp:11:
/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/TensorBody.h:222:30: note: declared here
  222 |   DeprecatedTypeProperties & type() const {
      |                              ^~~~
In file included from /home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/vision.cpp:11:
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/ms_deform_attn.h: In function ‘std::vector<at::Tensor> ms_deform_attn_backward(const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, int)’:
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/ms_deform_attn.h:51:20: warning: ‘at::DeprecatedTypeProperties& at::Tensor::type() const’ is deprecated: Tensor.type() is deprecated. Instead use Tensor.options(), which in many cases (e.g. in a constructor) is a drop-in replacement. If you were using data from type(), that is now available from Tensor itself, so instead of tensor.type().scalar_type(), use tensor.scalar_type() instead and instead of tensor.type().backend() use tensor.device(). [-Wdeprecated-declarations]
   51 |     if (value.type().is_cuda())
      |                    ^
In file included from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/Tensor.h:3,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/Tensor.h:3,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/autograd/function_hook.h:3,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/autograd/cpp_hook.h:2,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/autograd/variable.h:6,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/autograd/autograd.h:3,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/api/include/torch/autograd.h:3,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/api/include/torch/all.h:7,
                 from /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/extension.h:4,
                 from /home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cpu/ms_deform_attn_cpu.h:12,
                 from /home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/ms_deform_attn.h:13,
                 from /home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/vision.cpp:11:
/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/TensorBody.h:222:30: note: declared here
  222 |   DeprecatedTypeProperties & type() const {
      |                              ^~~~
[3/4] /home/kiriyama/mambaforge/bin/nvcc  -DTORCH_EXTENSION_NAME=MultiScaleDeformableAttention -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE=\"_gcc\" -DPYBIND11_STDLIB=\"_libstdcpp\" -DPYBIND11_BUILD_ABI=\"_cxxabi1011\" -I/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/api/include -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/TH -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/THC -isystem /home/kiriyama/mambaforge/include -isystem /home/kiriyama/mambaforge/include/python3.10 -D_GLIBCXX_USE_CXX11_ABI=0 -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -gencode=arch=compute_61,code=compute_61 -gencode=arch=compute_61,code=sm_61 --compiler-options '-fPIC' -DCUDA_HAS_FP16=1 -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ -std=c++17 -c /home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu -o ms_deform_attn_cuda.cuda.o
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu:19:9: warning: #pragma once in main file
   19 | #pragma once
      |         ^~~~
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_im2col_cuda.cuh(261): warning #177-D: variable "q_col" was declared but never referenced
          detected during instantiation of "void ms_deformable_im2col_cuda(cudaStream_t, const scalar_t *, const int64_t *, const int64_t *, const scalar_t *, const scalar_t *, int, int, int, int, int, int, int, scalar_t *) [with scalar_t=double]"
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu(67): here

/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_im2col_cuda.cuh(762): warning #177-D: variable "q_col" was declared but never referenced
          detected during instantiation of "void ms_deformable_col2im_cuda(cudaStream_t, const scalar_t *, const scalar_t *, const int64_t *, const int64_t *, const scalar_t *, const scalar_t *, int, int, int, int, int, int, int, scalar_t *, scalar_t *, scalar_t *) [with scalar_t=double]"
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu(137): here

/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_im2col_cuda.cuh(872): warning #177-D: variable "q_col" was declared but never referenced
          detected during instantiation of "void ms_deformable_col2im_cuda(cudaStream_t, const scalar_t *, const scalar_t *, const int64_t *, const int64_t *, const scalar_t *, const scalar_t *, int, int, int, int, int, int, int, scalar_t *, scalar_t *, scalar_t *) [with scalar_t=double]"
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu(137): here

/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_im2col_cuda.cuh(331): warning #177-D: variable "q_col" was declared but never referenced
          detected during instantiation of "void ms_deformable_col2im_cuda(cudaStream_t, const scalar_t *, const scalar_t *, const int64_t *, const int64_t *, const scalar_t *, const scalar_t *, int, int, int, int, int, int, int, scalar_t *, scalar_t *, scalar_t *) [with scalar_t=double]"
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu(137): here

/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_im2col_cuda.cuh(436): warning #177-D: variable "q_col" was declared but never referenced
          detected during instantiation of "void ms_deformable_col2im_cuda(cudaStream_t, const scalar_t *, const scalar_t *, const int64_t *, const int64_t *, const scalar_t *, const scalar_t *, int, int, int, int, int, int, int, scalar_t *, scalar_t *, scalar_t *) [with scalar_t=double]"
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu(137): here

/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_im2col_cuda.cuh(544): warning #177-D: variable "q_col" was declared but never referenced
          detected during instantiation of "void ms_deformable_col2im_cuda(cudaStream_t, const scalar_t *, const scalar_t *, const int64_t *, const int64_t *, const scalar_t *, const scalar_t *, int, int, int, int, int, int, int, scalar_t *, scalar_t *, scalar_t *) [with scalar_t=double]"
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu(137): here

/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_im2col_cuda.cuh(649): warning #177-D: variable "q_col" was declared but never referenced
          detected during instantiation of "void ms_deformable_col2im_cuda(cudaStream_t, const scalar_t *, const scalar_t *, const int64_t *, const int64_t *, const scalar_t *, const scalar_t *, int, int, int, int, int, int, int, scalar_t *, scalar_t *, scalar_t *) [with scalar_t=double]"
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu(137): here

/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu:19:9: warning: #pragma once in main file
   19 | #pragma once
      |         ^~~~
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu: In lambda function:
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu:67:86: warning: ‘T* at::Tensor::data() const [with T = double]’ is deprecated: Tensor.data<T>() is deprecated. Please use Tensor.data_ptr<T>() instead. [-Wdeprecated-declarations]
   67 |         AT_DISPATCH_FLOATING_TYPES(value.type(), "ms_deform_attn_forward_cuda", ([&] {
      |                                                                                      ^
/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/TensorBody.h:244:1: note: declared here
  244 |   T * data() const {
      | ^ ~~
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu:67:172: warning: ‘T* at::Tensor::data() const [with T = long int]’ is deprecated: Tensor.data<T>() is deprecated. Please use Tensor.data_ptr<T>() instead. [-Wdeprecated-declarations]
   67 |         AT_DISPATCH_FLOATING_TYPES(value.type(), "ms_deform_attn_forward_cuda", ([&] {
      |
                                                           ^
/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/TensorBody.h:244:1: note: declared here
  244 |   T * data() const {
      | ^ ~~
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu:67:215: warning: ‘T* at::Tensor::data() const [with T = long int]’ is deprecated: Tensor.data<T>() is deprecated. Please use Tensor.data_ptr<T>() instead. [-Wdeprecated-declarations]
   67 |         AT_DISPATCH_FLOATING_TYPES(value.type(), "ms_deform_attn_forward_cuda", ([&] {
      |
                                                                                                      ^
/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/TensorBody.h:244:1: note: declared here
  244 |   T * data() const {
      | ^ ~~
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu:67:248: warning: ‘T* at::Tensor::data() const [with T = double]’ is deprecated: Tensor.data<T>() is deprecated. Please use Tensor.data_ptr<T>() instead. [-Wdeprecated-declarations]
   67 |         AT_DISPATCH_FLOATING_TYPES(value.type(), "ms_deform_attn_forward_cuda", ([&] {
      |

               ^
/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/TensorBody.h:244:1: note: declared here
  244 |   T * data() const {
      | ^ ~~
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu:67:331: warning: ‘T* at::Tensor::data() const [with T = double]’ is deprecated: Tensor.data<T>() is deprecated. Please use Tensor.data_ptr<T>() instead. [-Wdeprecated-declarations]
   67 |         AT_DISPATCH_FLOATING_TYPES(value.type(), "ms_deform_attn_forward_cuda", ([&] {
      |

                                                                                                  ^
/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/TensorBody.h:244:1: note: declared here
  244 |   T * data() const {
      | ^ ~~
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu:67:489: warning: ‘T* at::Tensor::data() const [with T = double]’ is deprecated: Tensor.data<T>() is deprecated. Please use Tensor.data_ptr<T>() instead. [-Wdeprecated-declarations]
   67 |         AT_DISPATCH_FLOATING_TYPES(value.type(), "ms_deform_attn_forward_cuda", ([&] {
      |



                ^
/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/TensorBody.h:244:1: note: declared here
  244 |   T * data() const {
      | ^ ~~
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu: In lambda function:
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu:67:85: warning: ‘T* at::Tensor::data() const [with T = float]’ is deprecated: Tensor.data<T>() is deprecated. Please use Tensor.data_ptr<T>() instead. [-Wdeprecated-declarations]
   67 |         AT_DISPATCH_FLOATING_TYPES(value.type(), "ms_deform_attn_forward_cuda", ([&] {
      |                                                                                     ^
/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/TensorBody.h:244:1: note: declared here
  244 |   T * data() const {
      | ^ ~~
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu:67:171: warning: ‘T* at::Tensor::data() const [with T = long int]’ is deprecated: Tensor.data<T>() is deprecated. Please use Tensor.data_ptr<T>() instead. [-Wdeprecated-declarations]
   67 |         AT_DISPATCH_FLOATING_TYPES(value.type(), "ms_deform_attn_forward_cuda", ([&] {
      |
                                                          ^
/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/TensorBody.h:244:1: note: declared here
  244 |   T * data() const {
      | ^ ~~
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu:67:214: warning: ‘T* at::Tensor::data() const [with T = long int]’ is deprecated: Tensor.data<T>() is deprecated. Please use Tensor.data_ptr<T>() instead. [-Wdeprecated-declarations]
   67 |         AT_DISPATCH_FLOATING_TYPES(value.type(), "ms_deform_attn_forward_cuda", ([&] {
      |
                                                                                                     ^
/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/TensorBody.h:244:1: note: declared here
  244 |   T * data() const {
      | ^ ~~
/home/kiriyama/Projects/transformers/src/transformers/kernels/deformable_detr/cuda/ms_deform_attn_cuda.cu:67:246: warning: ‘T* at::Tensor::data() const [with T = float]’ is deprecated: Tensor.data<T>() is deprecated. Please use Tensor.data_ptr<T>() instead. [-Wdeprecated-declarations]
   67 |         AT_DISPATCH_FLOATING_TYPES(value.type(), "ms_deform_attn_forward_cuda", ([&] {
/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/ATen/core/TensorBody.h:244:1: note: declared here
  244 |   T * data() const {

[the same -Wdeprecated-declarations warning repeats for every Tensor.data<T>() call site at ms_deform_attn_cuda.cu:67 and :137 (forward and backward lambdas, with T = float, double, and long int); each one suggests the same fix: use Tensor.data_ptr<T>() instead]

[4/4] c++ vision.o ms_deform_attn_cpu.o ms_deform_attn_cuda.cuda.o -shared -L/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/lib -lc10 -lc10_cuda -ltorch_cpu -ltorch_cuda -ltorch -ltorch_python -L/home/kiriyama/mambaforge/lib64 -lcudart -o MultiScaleDeformableAttention.so
FAILED: MultiScaleDeformableAttention.so
c++ vision.o ms_deform_attn_cpu.o ms_deform_attn_cuda.cuda.o -shared -L/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/lib -lc10 -lc10_cuda -ltorch_cpu -ltorch_cuda -ltorch -ltorch_python -L/home/kiriyama/mambaforge/lib64 -lcudart -o MultiScaleDeformableAttention.so
/usr/bin/ld: cannot find -lcudart
collect2: error: ld returned 1 exit status
ninja: build stopped: subcommand failed.

Using /home/kiriyama/.cache/torch_extensions/py310_cu118 as PyTorch extensions root...
Creating extension directory /home/kiriyama/.cache/torch_extensions/py310_cu118/cuda_kernel...
Detected CUDA files, patching ldflags
Emitting ninja build file /home/kiriyama/.cache/torch_extensions/py310_cu118/cuda_kernel/build.ninja...
Building extension module cuda_kernel...
Allowing ninja to set a default number of workers... (overridable by setting the environment variable MAX_JOBS=N)
[1/4] /home/kiriyama/mambaforge/bin/nvcc  -DTORCH_EXTENSION_NAME=cuda_kernel -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE=\"_gcc\" -DPYBIND11_STDLIB=\"_libstdcpp\" -DPYBIND11_BUILD_ABI=\"_cxxabi1011\" -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/api/include -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/TH -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/THC -isystem /home/kiriyama/mambaforge/include -isystem /home/kiriyama/mambaforge/include/python3.10 -D_GLIBCXX_USE_CXX11_ABI=0 -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -gencode=arch=compute_61,code=compute_61 -gencode=arch=compute_61,code=sm_61 --compiler-options '-fPIC' -std=c++17 -c /home/kiriyama/Projects/transformers/src/transformers/kernels/mra/cuda_kernel.cu -o cuda_kernel.cuda.o
[2/4] c++ -MMD -MF torch_extension.o.d -DTORCH_EXTENSION_NAME=cuda_kernel -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE=\"_gcc\" -DPYBIND11_STDLIB=\"_libstdcpp\" -DPYBIND11_BUILD_ABI=\"_cxxabi1011\" -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/api/include -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/TH -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/THC -isystem /home/kiriyama/mambaforge/include -isystem /home/kiriyama/mambaforge/include/python3.10 -D_GLIBCXX_USE_CXX11_ABI=0 -fPIC -std=c++17 -c /home/kiriyama/Projects/transformers/src/transformers/kernels/mra/torch_extension.cpp -o torch_extension.o
[3/4] /home/kiriyama/mambaforge/bin/nvcc  -DTORCH_EXTENSION_NAME=cuda_kernel -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE=\"_gcc\" -DPYBIND11_STDLIB=\"_libstdcpp\" -DPYBIND11_BUILD_ABI=\"_cxxabi1011\" -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/torch/csrc/api/include -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/TH -isystem /home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/include/THC -isystem /home/kiriyama/mambaforge/include -isystem /home/kiriyama/mambaforge/include/python3.10 -D_GLIBCXX_USE_CXX11_ABI=0 -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -gencode=arch=compute_61,code=compute_61 -gencode=arch=compute_61,code=sm_61 --compiler-options '-fPIC' -std=c++17 -c /home/kiriyama/Projects/transformers/src/transformers/kernels/mra/cuda_launch.cu -o cuda_launch.cuda.o
[4/4] c++ cuda_kernel.cuda.o cuda_launch.cuda.o torch_extension.o -shared -L/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/lib -lc10 -lc10_cuda -ltorch_cpu -ltorch_cuda -ltorch -ltorch_python -L/home/kiriyama/mambaforge/lib64 -lcudart -o cuda_kernel.so
FAILED: cuda_kernel.so
c++ cuda_kernel.cuda.o cuda_launch.cuda.o torch_extension.o -shared -L/home/kiriyama/mambaforge/lib/python3.10/site-packages/torch/lib -lc10 -lc10_cuda -ltorch_cpu -ltorch_cuda -ltorch -ltorch_python -L/home/kiriyama/mambaforge/lib64 -lcudart -o cuda_kernel.so
/usr/bin/ld: cannot find -lcudart
collect2: error: ld returned 1 exit status
ninja: build stopped: subcommand failed.
Failed to load CUDA kernels. Mra requires custom CUDA kernels. Please verify that compatible versions of PyTorch and CUDA Toolkit are installed: Error building extension 'cuda_kernel'
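A side note on the actual failure: the deprecation warnings above are harmless; both builds die at the link step because `ld` cannot find `libcudart` on its search path. A minimal, standard-library-only Python sketch (not part of transformers, just a quick diagnostic) to check whether the CUDA runtime library is discoverable at all:

```python
# Check whether the CUDA runtime library (the -lcudart the linker wants)
# is discoverable on this machine. Uses only the Python standard library.
from ctypes.util import find_library

libcudart = find_library("cudart")
print(libcudart)  # None means no libcudart on the default search paths
```

If this prints None, pointing the toolchain at the CUDA toolkit's lib64 directory (for example via LIBRARY_PATH / LD_LIBRARY_PATH) is the usual fix.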

@ydshieh
Collaborator Author

ydshieh commented Oct 17, 2023

What is your torch version?

@HaoES

HaoES commented Oct 17, 2023

What is your torch version?

It's 2.0.1.

@Sparty
Contributor

Sparty commented Oct 17, 2023

ChineseCLIPImageProcessor, ChineseCLIPTextConfig, ChineseCLIPVisionConfig

@AksharGoyal
Contributor

AksharGoyal commented Oct 18, 2023

I will take AltCLIPVisionConfig, AltCLIPTextConfig

@AksharGoyal
Contributor

AksharGoyal commented Oct 18, 2023

@ydshieh I ran make fixup as instructed and I got the following output.

make fixup
fatal: Not a valid object name main
Traceback (most recent call last):
  File "/workspaces/transformers/utils/get_modified_files.py", line 27, in <module>
    fork_point_sha = subprocess.check_output("git merge-base main HEAD".split()).decode("utf-8")
  File "/usr/local/python/3.10.8/lib/python3.10/subprocess.py", line 421, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/local/python/3.10.8/lib/python3.10/subprocess.py", line 526, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['git', 'merge-base', 'main', 'HEAD']' returned non-zero exit status 128.
No library .py files were modified
python utils/custom_init_isort.py
python utils/sort_auto_mappings.py
doc-builder style src/transformers docs/source --max_len 119 --path_to_docs docs/source
python utils/check_doc_toc.py --fix_and_overwrite
running deps_table_update
updating src/transformers/dependency_versions_table.py
python utils/check_copies.py
python utils/check_table.py
python utils/check_dummies.py
python utils/check_repo.py
2023-10-18 20:17:59.925820: E tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:9342] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2023-10-18 20:17:59.925873: E tensorflow/compiler/xla/stream_executor/cuda/cuda_fft.cc:609] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2023-10-18 20:17:59.925929: E tensorflow/compiler/xla/stream_executor/cuda/cuda_blas.cc:1518] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2023-10-18 20:18:00.934413: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Checking all models are included.
Checking all models are public.
Checking all models are properly tested.
Checking all objects are properly documented.
Checking all models are in at least one auto class.
Checking all names in auto name mappings are defined.
Checking all keys in auto name mappings are defined in `CONFIG_MAPPING_NAMES`.
Checking all auto mappings could be imported.
Checking all objects are equally (across frameworks) in the main __init__.
Using TensorFlow backend
/workspaces/transformers/src/transformers/deepspeed.py:23: FutureWarning: transformers.deepspeed module is deprecated and will be removed in a future version. Please import deepspeed modules directly from transformers.integrations
  warnings.warn(
/workspaces/transformers/src/transformers/generation_flax_utils.py:24: FutureWarning: Importing `FlaxGenerationMixin` from `src/transformers/generation_flax_utils.py` is deprecated and will be removed in Transformers v5. Import as `from transformers import FlaxGenerationMixin` instead.
  warnings.warn(
/workspaces/transformers/src/transformers/generation_tf_utils.py:24: FutureWarning: Importing `TFGenerationMixin` from `src/transformers/generation_tf_utils.py` is deprecated and will be removed in Transformers v5. Import as `from transformers import TFGenerationMixin` instead.
  warnings.warn(
/workspaces/transformers/src/transformers/generation_utils.py:24: FutureWarning: Importing `GenerationMixin` from `src/transformers/generation_utils.py` is deprecated and will be removed in Transformers v5. Import as `from transformers import GenerationMixin` instead.
  warnings.warn(
No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)
Checking the DEPRECATED_MODELS constant is up to date.
python utils/check_inits.py
python utils/check_config_docstrings.py
python utils/check_config_attributes.py
python utils/check_doctest_list.py
python utils/update_metadata.py --check-only
2023-10-18 20:18:17.099792: E tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:9342] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2023-10-18 20:18:17.099854: E tensorflow/compiler/xla/stream_executor/cuda/cuda_fft.cc:609] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2023-10-18 20:18:17.099911: E tensorflow/compiler/xla/stream_executor/cuda/cuda_blas.cc:1518] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2023-10-18 20:18:18.165840: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
python utils/check_task_guides.py
python utils/check_docstrings.py
2023-10-18 20:18:24.343502: E tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:9342] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2023-10-18 20:18:24.343555: E tensorflow/compiler/xla/stream_executor/cuda/cuda_fft.cc:609] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2023-10-18 20:18:24.343613: E tensorflow/compiler/xla/stream_executor/cuda/cuda_blas.cc:1518] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2023-10-18 20:18:25.399133: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Traceback (most recent call last):
  File "/workspaces/transformers/utils/check_docstrings.py", line 1238, in <module>
    check_docstrings(overwrite=args.fix_and_overwrite)
  File "/workspaces/transformers/utils/check_docstrings.py", line 1230, in check_docstrings
    raise ValueError(error_message)
ValueError: There was at least one problem when checking docstrings of public objects.
The following objects docstrings do not match their signature. Run `make fix-copies` to fix this.
- TFRegNetForImageClassification
- TFRegNetModel
make: *** [Makefile:46: repo-consistency] Error 1

I would like to know if this is the expected output.

These are the details after running transformers-cli env:

- `transformers` version: 4.35.0.dev0
- Platform: Linux-6.2.0-1014-azure-x86_64-with-glibc2.31
- Python version: 3.10.8
- Huggingface_hub version: 0.17.3
- Safetensors version: 0.4.0
- Accelerate version: 0.23.0
- Accelerate config:    not found
- PyTorch version (GPU?): 2.1.0+cu121 (False)
- Tensorflow version (GPU?): 2.14.0 (False)
- Flax version (CPU?/GPU?/TPU?): 0.7.0 (cpu)
- Jax version: 0.4.13
- JaxLib version: 0.4.13

I did run make fix-copies, and the following was the output:

$ make fix-copies
python utils/check_copies.py --fix_and_overwrite
python utils/check_table.py --fix_and_overwrite
python utils/check_dummies.py --fix_and_overwrite
python utils/check_doctest_list.py --fix_and_overwrite
python utils/check_task_guides.py --fix_and_overwrite
python utils/check_docstrings.py --fix_and_overwrite
2023-10-18 20:56:23.959231: E tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:9342] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2023-10-18 20:56:23.959590: E tensorflow/compiler/xla/stream_executor/cuda/cuda_fft.cc:609] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2023-10-18 20:56:23.959717: E tensorflow/compiler/xla/stream_executor/cuda/cuda_blas.cc:1518] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2023-10-18 20:56:24.963094: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
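For anyone wondering what the docstring step of make fixup is actually checking: conceptually, it compares each public object's signature against the arguments documented in its docstring. The real logic lives in utils/check_docstrings.py and is far more thorough (it also checks defaults and formatting); the following is only an illustrative sketch, with a hypothetical helper name:

```python
# Illustrative sketch of a docstring-vs-signature check, in the spirit of
# utils/check_docstrings.py. Not the actual transformers implementation.
import inspect
import re


def missing_doc_args(obj):
    """Return signature parameters that never appear in obj's docstring."""
    doc = inspect.getdoc(obj) or ""
    # Docstring argument entries look like "    name (type): description".
    documented = set(re.findall(r"^\s*(\w+)\s*\(", doc, flags=re.MULTILINE))
    params = [
        p for p in inspect.signature(obj).parameters
        if p not in ("self", "args", "kwargs")
    ]
    return [p for p in params if p not in documented]


def example(hidden_size, dropout=0.1):
    """Toy function.

    Args:
        hidden_size (int): Size of the hidden layer.
    """


print(missing_doc_args(example))  # → ['dropout']
```

When a mismatch like this is found, check_docstrings.py raises the "docstrings do not match their signature" error seen above, and --fix_and_overwrite rewrites the docstring stub in place.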

@ydshieh
Collaborator Author

ydshieh commented Oct 19, 2023

I guess you have ninja installed. Unfortunately, MRA has an issue with its custom CUDA kernel; see this comment

Either creating a fresh Python environment or uninstalling ninja in the existing one should work.

@Sparty
Contributor

Sparty commented Oct 20, 2023

ErnieConfig, ErnieMConfig

blbadger pushed a commit to blbadger/transformers that referenced this issue Nov 8, 2023
* [docstring] Remove 'BertGenerationConfig' from OBJECTS_TO_IGNORE

* [docstring] Fix docstring for 'BertGenerationConfig' (huggingface#26638)
EduardoPach pushed a commit to EduardoPach/transformers that referenced this issue Nov 18, 2023
* [docstring] Remove 'BertGenerationConfig' from OBJECTS_TO_IGNORE

* [docstring] Fix docstring for 'BertGenerationConfig' (huggingface#26638)