GPU memory and inference time, measured as a function of sequence length.

Benchmark environment:
- transformers_version: 4.11.3
- framework: PyTorch
- use_torchscript: False
- framework_version: 1.7.1
- python_version: 3.6.13
- system: Linux
- cpu: x86_64
- architecture: 64bit
- fp16: False
- use_multiprocessing: True
- only_pretrain_model: False
- cpu_ram_mb: 15717
- use_gpu: True
- num_gpus: 1
- gpu: Tesla T4
- gpu_ram_mb: 15109
- gpu_power_watts: 70.0
- gpu_performance_state: 0
- use_tpu: False
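The environment report above matches the output of the benchmarking utilities shipped with transformers 4.11.3 (`PyTorchBenchmarkArguments` / `PyTorchBenchmark`), which sweep batch sizes and sequence lengths and record inference time and memory. As a minimal stdlib sketch of the same idea, the snippet below times a stand-in workload (a hypothetical `dummy_forward`, not a real model) at several sequence lengths; swap in an actual model forward pass to reproduce the measurement:

```python
import time

def benchmark_inference(forward_fn, sequence_lengths, repeats=5):
    """Time forward_fn at each sequence length; return {seq_len: best seconds}."""
    results = {}
    for seq_len in sequence_lengths:
        timings = []
        for _ in range(repeats):
            start = time.perf_counter()
            forward_fn(seq_len)
            timings.append(time.perf_counter() - start)
        # Best-of-N reduces noise from the OS scheduler and warm-up effects.
        results[seq_len] = min(timings)
    return results

# Stand-in "model": cost grows quadratically with sequence length,
# mimicking the self-attention term that dominates transformer inference.
def dummy_forward(seq_len):
    total = 0
    for i in range(seq_len):
        for j in range(seq_len):
            total += i * j
    return total

timings = benchmark_inference(dummy_forward, [32, 128, 512])
for seq_len, seconds in sorted(timings.items()):
    print(f"seq_len={seq_len:4d}  best={seconds * 1e3:.2f} ms")
```

GPU memory would be tracked separately (e.g. with `torch.cuda.max_memory_allocated` when PyTorch and a GPU are available); the transformers benchmark classes do both when `memory=True` is passed.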