torchmeta import error #117
Comments
!pip install torchmeta
@junlee94 Hello. Could you try following the steps below? You only need to enter the lines starting with an exclamation mark into notebook cells.

1. Check the version
!python --version
Python 3.10.12
!cat /etc/os-release
PRETTY_NAME="Ubuntu 22.04.3 LTS"
NAME="Ubuntu"
VERSION_ID="22.04"
VERSION="22.04.3 LTS (Jammy Jellyfish)"
VERSION_CODENAME=jammy
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
UBUNTU_CODENAME=jammy

2. Install Python 3.9
!apt-get update && apt-get install -y python3.9 python3-pip python3.9-distutils
!which python3.9
/usr/bin/python3.9

3. Create a symbolic link
!python3 --version
Python 3.10.12
!rm /usr/bin/python3
!ln -s /usr/bin/python3.9 /usr/bin/python3
!python3 --version
Python 3.9.18

4. Install torchmeta
!pip install torchmeta
...
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
WARNING: The following packages were previously imported in this runtime:
[certifi]
You must restart the runtime in order to use newly installed versions.
You can just click Cancel on the restart prompt.

5. Check GPU availability
!nvidia-smi
Thu Mar 7 12:03:02 2024
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.104.05 Driver Version: 535.104.05 CUDA Version: 12.2 |
|-----------------------------------------+----------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+======================+======================|
| 0 Tesla T4 Off | 00000000:00:04.0 Off | 0 |
| N/A 37C P8 9W / 70W | 0MiB / 15360MiB | 0% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
+---------------------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=======================================================================================|
| No running processes found |
+---------------------------------------------------------------------------------------+

import torch
print(torch.cuda.is_available())  # True
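The steps above work around torchmeta's incompatibility with newer Python runtimes by downgrading Colab to 3.9. A minimal sketch to sanity-check the result after the downgrade (the 3.10 bound is an assumption based on this thread; check torchmeta's own requirements for the exact version constraint):

```python
import importlib.util
import sys

def check_import(name):
    """Return True if the current interpreter can locate the package."""
    return importlib.util.find_spec(name) is not None

# Assumption from this thread: torchmeta needs Python < 3.10, which is
# why the steps above replace Colab's default 3.10 runtime with 3.9.
compatible = sys.version_info < (3, 10)
print("Python version compatible with torchmeta:", compatible)
print("torchmeta importable:", check_import("torchmeta"))
```

If `torchmeta importable: False` is printed after step 4, the pip install likely went to a different interpreter than the one the notebook kernel is running.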
This issue has been automatically marked as stale because it has been open for 20 days.
This issue was automatically closed because it remained stale for 10 days.
I finished installing the package from the Anaconda Prompt, but when I then try it in a Jupyter notebook, I get an error saying torchmeta cannot be imported.
Is there a way to fix this?
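One common cause of this symptom (a guess, not confirmed in the thread) is that the Anaconda Prompt's pip installed torchmeta into a different environment than the one backing the Jupyter kernel. A minimal diagnostic sketch; `same_environment` is a hypothetical helper for illustration:

```python
import sys

# The interpreter the notebook kernel is actually running. torchmeta must
# be installed into this environment, not just whichever `pip` happened to
# be on PATH in the Anaconda Prompt.
print(sys.executable)

# Installing via the kernel's own interpreter ties pip to the same
# environment (run in a notebook cell):
#   !{sys.executable} -m pip install torchmeta

def same_environment(pip_python, kernel_python=sys.executable):
    """Illustrative check: did pip run under the same interpreter as the kernel?"""
    return pip_python == kernel_python
```

Comparing `sys.executable` in the notebook against the output of `where python` in the Anaconda Prompt shows quickly whether the two point at the same environment.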