
[Bug]: filename pattern [prompt_hash] doesn't include negative prompt or other parameters #12637

Closed
1 task done
Dr-Turtle opened this issue Aug 18, 2023 · 4 comments
Labels
bug-report Report of a bug, yet to be confirmed

Comments

@Dr-Turtle

Is there an existing issue for this?

  • I have searched the existing issues and checked the recent builds/commits

What happened?

[prompt_hash] only changes when the prompt changes, not when other parameters like the negative prompt or resolution are changed.

I assumed that prompt_hash would account for other generation parameters, given that there's no equivalent option for negative prompts (and other parameters).

Steps to reproduce the problem

  1. Add [prompt_hash] to the "Saving to a directory" > "Directory name pattern" setting, and click Apply
  2. Generate an image
  3. Generate an image with the same prompt and a different negative prompt

The result is two files with the same prompt_hash in their name/path.
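The collision can be sketched in a few lines of Python. This is an illustration of the behavior, not the webui's actual implementation; the 8-character truncation is an assumption made for the example.

```python
import hashlib

def prompt_hash(prompt: str) -> str:
    # Hash only the positive prompt text, truncated for use in a
    # filename. (The 8-character length is an assumption.)
    return hashlib.sha256(prompt.encode("utf-8")).hexdigest()[:8]

# Same prompt, different negative prompts -> identical names,
# because the negative prompt never enters the hash.
a = prompt_hash("a photo of a turtle")  # negative prompt: "blurry"
b = prompt_hash("a photo of a turtle")  # negative prompt: "lowres"
print(a == b)  # True
```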

What should have happened?

The prompt_hash should change when anything affecting generation is changed (including negative prompt, resolution, etc.)

Rather than changing the existing behavior, I think a new pattern should be added that accounts for everything influencing the image, in case others prefer the current behavior.
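Such a pattern could hash the full set of generation parameters rather than just the prompt. The sketch below is hypothetical (params_hash is not a real webui function) and assumes the parameters can be serialized to JSON:

```python
import hashlib
import json

def params_hash(params: dict) -> str:
    # Serialize every generation parameter with sorted keys, so the
    # hash doesn't depend on dict insertion order, then truncate.
    blob = json.dumps(params, sort_keys=True).encode("utf-8")
    return hashlib.sha256(blob).hexdigest()[:8]

h1 = params_hash({"prompt": "turtle", "negative_prompt": "blurry", "width": 512})
h2 = params_hash({"prompt": "turtle", "negative_prompt": "lowres", "width": 512})
print(h1 != h2)  # True: changing the negative prompt changes the name
```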

Version or Commit where the problem happens

1.5.1

What Python version are you running on ?

Python 3.11.x (and above, not supported yet)

What platforms do you use to access the UI ?

Linux

What device are you running WebUI on?

Nvidia GPUs (GTX 16 or below)

Cross attention optimization

Automatic

What browsers do you use to access the UI ?

Mozilla Firefox

Command Line Arguments

--lowvram

List of extensions

No

Console logs

[turtle@homepc1 21:22 stable-diffusion-webui]$ ./webui.sh --lowvram

################################################################
Install script for stable-diffusion + Web UI
Tested on Debian 11 (Bullseye)
################################################################

################################################################
Running on turtle user
################################################################

################################################################
Repo already cloned, using it as install directory
################################################################

################################################################
Create and activate python venv
################################################################

################################################################
Launching launch.py...
################################################################
Using TCMalloc: libtcmalloc_minimal.so.4
Python 3.11.3 (main, Jun  5 2023, 09:32:32) [GCC 13.1.1 20230429]
Version: v1.5.1
Commit hash: 68f336bd994bed5442ad95bad6b6ad5564a5409a
Launching Web UI with arguments: --lowvram
no module 'xformers'. Processing without...
no module 'xformers'. Processing without...
No module 'xformers'. Proceeding without it.
Loading weights [f915efad12] from /home/turtle/automatic1111/stable-diffusion-webui/models/Stable-diffusion/a7b3_v10.safetensors
Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.
Startup time: 9.1s (launcher: 2.5s, import torch: 3.0s, import gradio: 0.8s, setup paths: 0.8s, other imports: 0.7s, load scripts: 0.5s, create ui: 0.6s, gradio launch: 0.2s).
Creating model from config: /home/turtle/automatic1111/stable-diffusion-webui/configs/v1-inference.yaml
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
Applying attention optimization: Doggettx... done.
Model loaded in 3.7s (load weights from disk: 1.0s, create model: 0.5s, apply weights to model: 1.8s, apply half(): 0.3s, calculate empty prompt: 0.1s).
100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 20/20 [00:46<00:00,  2.32s/it]
Total progress: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 20/20 [00:44<00:00,  2.21s/it]
^CInterrupted with signal 2 in <frame at 0x7f7032e4f220, file '/usr/lib/python3.11/threading.py', line 324, code wait>██████████████████████████████████████████████████████████████████████████████████████| 20/20 [00:44<00:00,  2.28s/it]

Additional information

No response

@Dr-Turtle Dr-Turtle added the bug-report Report of a bug, yet to be confirmed label Aug 18, 2023
@w-e-w
Collaborator

w-e-w commented Aug 18, 2023

It's not really a bug, but I made a PR that may fit your use case.
see #12639

I think a generation-parameter hash isn't that practical, because some extensions might do things that aren't tracked in the parameters, which would be hard to account for.
So I added another pattern, image_hash: if two images have the same image hash, you can basically say they are the same image, regardless of the process that produced them.
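The idea can be sketched as hashing the encoded image bytes themselves. This is an illustration of the concept, not the code from the PR:

```python
import hashlib

def image_hash(image_bytes: bytes, length: int = 8) -> str:
    # Hash the final image data: any parameter that changes the output
    # pixels (negative prompt, resolution, seed, extension effects)
    # changes the bytes, and therefore the hash.
    return hashlib.sha256(image_bytes).hexdigest()[:length]

# Byte-identical files always share a hash, whatever produced them.
print(image_hash(b"\x89PNG...") == image_hash(b"\x89PNG..."))  # True
```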

@Dr-Turtle
Author

> It's not really a bug, but I made a PR that may fit your use case. See #12639.
>
> I think a generation-parameter hash isn't that practical, because some extensions might do things that aren't tracked in the parameters, which would be hard to account for. So I added another pattern, image_hash: if two images have the same image hash, you can basically say they are the same image, regardless of the process that produced them.

Thanks for the quick response and PR, tbh I wasn't expecting anything to come of this issue given the number of open issues :P

@w-e-w
Collaborator

w-e-w commented Aug 19, 2023

The PR has been merged into dev.

@w-e-w w-e-w closed this as completed Aug 19, 2023
@w-e-w
Collaborator

w-e-w commented Aug 19, 2023

> Thanks for the quick response and PR, tbh I wasn't expecting anything to come of this issue given the number of open issues :P

The issue tracker is still functional.
I myself and a couple of others keep a close watch; you can see most issues have comments.

And most of the issues are just people asking for help with installation.

Since we didn't set up an automated job to close stale issues, we have to go through them manually, and that is very tedious work.
