Hello,
I encountered an error where the optimization never terminates when using LoRA.
You can see from the image below that the processing time is more than 2000s.
I used the same image that was reproduced well in Issue #2.
The other parameters are the same as the defaults.
If I run without LoRA, it completes normally.
The terminal message is as below:
Running on local URL: ...
Running on public URL: ...
This share link expires in 72 hours. For free permanent hosting and GPU upgrades, run `gradio deploy` from Terminal to deploy to Spaces (https://huggingface.co/spaces)
You are using a model of type clip_text_model to instantiate a model of type . This is not supported for all configurations of models and can yield errors.
/<PATH>/miniconda3/envs/sdedrag/lib/python3.9/site-packages/diffusers/models/attention_processor.py:1946: FutureWarning: `LoRAAttnProcessor2_0` is deprecated and will be removed in version 0.26.0. Make sure use AttnProcessor2_0 instead by setting LoRA layers to `self.{to_q,to_k,to_v,to_out[0]}.lora_layer` respectively. This will be done automatically when using `LoraLoaderMixin.load_lora_weights`
deprecate(
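The warning above gives a concrete threshold: `LoRAAttnProcessor2_0` is removed in diffusers 0.26.0. For anyone else hitting this, here is a stdlib-only sketch for checking an installed version string against that threshold (the helper names are mine, not part of diffusers):

```python
def version_tuple(v):
    """Parse a dotted version like '0.25.1' into (0, 25, 1).

    Non-numeric suffix characters (e.g. '0.26.0.dev0') are ignored
    per component; good enough for a simple threshold check.
    """
    parts = []
    for p in v.split("."):
        digits = "".join(ch for ch in p if ch.isdigit())
        if digits:
            parts.append(int(digits))
    return tuple(parts)

def is_removed_in(installed, removal="0.26.0"):
    """True if `installed` is at or past the version where the
    deprecated LoRAAttnProcessor2_0 path is removed (0.26.0,
    per the FutureWarning in the log above)."""
    return version_tuple(installed) >= version_tuple(removal)
```

Calling `is_removed_in(diffusers.__version__)` would then flag an environment where the old processor path no longer exists at all, as opposed to merely being deprecated.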
I think it may be due to the environment (a package version conflict), so could you distribute a more precise environment file? The dependency file you provided already caused some errors, since this repo does not support the newest Gradio version.
I reproduced the setup in this environment, but LoRA still doesn't work.
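Until a pinned environment file is available, something along the lines of the `requirements.txt` fragment below might help others reproduce. Every pin here is my own guess, not a version the repo authors have confirmed:

```
# Hypothetical pins (guesses, not confirmed by the repo authors):
# diffusers kept below 0.26, where the FutureWarning says
# LoRAAttnProcessor2_0 will be removed; gradio pinned to an
# older release since the newest Gradio version is not supported.
diffusers<0.26
gradio==3.50.2
```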
I think it is largely because the process was terminated; sde-drag shows a progress bar while the LoRA is being trained. You can try running the code again.