Replies: 1 comment 1 reply
-
The real error is further up, but it's likely a VRAM limitation. If you are on KoboldAI United, try turning 4-bit on at the bottom of the sliders.
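To see why 4-bit mode makes the difference, here is a rough back-of-the-envelope estimate of the VRAM needed just to hold a 6B-parameter model's weights (a sketch; it ignores activations, KV cache, and CUDA overhead, which all add more on top):

```python
def weight_vram_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate VRAM needed to hold the model weights alone, in GB."""
    return n_params * bytes_per_param / 1e9

n_params = 6e9  # e.g. Pygmalion-6b

fp16 = weight_vram_gb(n_params, 2.0)       # 16-bit weights: 2 bytes/param
four_bit = weight_vram_gb(n_params, 0.5)   # 4-bit weights: half a byte/param

print(f"fp16:  {fp16:.1f} GB")     # 12.0 GB -- weights alone already fill a 12 GB card
print(f"4-bit: {four_bit:.1f} GB") # 3.0 GB -- leaves plenty of headroom
```

So in fp16, the weights alone saturate a typical 12 GB consumer GPU before any inference overhead, which is why the loader dies with an out-of-memory error further up the log.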
-
I got
You are using a model of type gptj to instantiate a model of type gpt_neo. This is not supported for all configurations of models and can yield errors.
when loading the Pygmalion-6b model, and loading then fails with a few more errors.
It seems related to the model configuration; any idea how to fix it?
(I'm running on an RTX 3080 Ti, Win11)
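That warning means the loader expected one architecture (`gpt_neo`) but the model's `config.json` declares another (`gptj`). A quick way to see what the config actually says is to read it directly; this is a minimal sketch, and the directory path and expected type are placeholders, not values from this thread:

```python
import json
from pathlib import Path

def check_model_type(model_dir: str, expected: str) -> bool:
    """Compare the model_type declared in config.json against what the
    loader expects; print a diagnostic on mismatch."""
    config = json.loads((Path(model_dir) / "config.json").read_text())
    actual = config.get("model_type", "<missing>")
    if actual != expected:
        print(f"Mismatch: config.json declares {actual!r}, "
              f"but the loader expects {expected!r}")
        return False
    return True
```

If the declared type is wrong for the checkpoint (or the loader is pinned to the wrong architecture), fixing the mismatch on either side usually makes the warning, and the follow-on load errors, go away.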