
[Bug]: alternating words with an empty option throws an exception #8161

Closed
1 task done
rubberbaron opened this issue Feb 27, 2023 · 3 comments
Labels
bug Report of a confirmed bug

Comments

@rubberbaron

Is there an existing issue for this?

  • I have searched the existing issues and checked the recent builds/commits

What happened?

Using an empty option in an alternation, like "[red|] apple", throws an exception. An easy workaround is to put a space in the empty option, like "[red| ] apple", which avoids the error.

Steps to reproduce the problem

  1. Type "[red|] apple" as the prompt
  2. Press Generate

What should have happened?

It should not throw an exception.

Commit where the problem happens

d5ce044

What platforms do you use to access the UI?

Windows

What browsers do you use to access the UI?

Mozilla Firefox

Command Line Arguments

xformers

List of extensions

wildcards, controlnet

Console logs

Error completing request
Arguments: ('task(wz2r40k8hiugpjz)', '[|red] thing', '\n', [], 70, 0, True, False, 1, 1, 7, -1.0, -1.0, 0, 0, 0, False, 400, 700, True, 0.15, 2, '4x_foolhardy_Remacri', 20, 0, 0, 0, False, 'canny', 'control_sd15_canny(fef5e48e)', 0.1, None, False, 'Scale to Fit (Inner Fit)', False, False, False, False, False, False, '', '', '', 1, '', 0, '', 0, '', True, False, False, False) {}
Traceback (most recent call last):
  File "D:\ai\sdweb\modules\prompt_parser.py", line 76, in alternate
    yield next(args[(step - 1)%len(args)])
StopIteration

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "D:\ai\sdweb\venv\lib\site-packages\lark\visitors.py", line 116, in _call_userfunc
    return f(children)
  File "D:\ai\sdweb\modules\prompt_parser.py", line 84, in start
    return ''.join(flatten(args))
  File "D:\ai\sdweb\modules\prompt_parser.py", line 83, in flatten
    yield from flatten(gen)
  File "D:\ai\sdweb\modules\prompt_parser.py", line 83, in flatten
    yield from flatten(gen)
  File "D:\ai\sdweb\modules\prompt_parser.py", line 82, in flatten
    for gen in x:
RuntimeError: generator raised StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\ai\sdweb\modules\call_queue.py", line 56, in f
    res = list(func(*args, **kwargs))
  File "D:\ai\sdweb\modules\call_queue.py", line 37, in f
    res = func(*args, **kwargs)
  File "D:\ai\sdweb\modules\txt2img.py", line 52, in txt2img
    processed = process_images(p)
  File "D:\ai\sdweb\modules\processing.py", line 476, in process_images
    res = process_images_inner(p)
  File "D:\ai\sdweb\modules\processing.py", line 604, in process_images_inner
    c = get_conds_with_caching(prompt_parser.get_multicond_learned_conditioning, prompts, p.steps, cached_c)
  File "D:\ai\sdweb\modules\processing.py", line 562, in get_conds_with_caching
    cache[1] = function(shared.sd_model, required_prompts, steps)
  File "D:\ai\sdweb\modules\prompt_parser.py", line 205, in get_multicond_learned_conditioning
    learned_conditioning = get_learned_conditioning(model, prompt_flat_list, steps)
  File "D:\ai\sdweb\scripts\prompt_blending.py", line 40, in hijacked_get_learned_conditioning
    return real_get_learned_conditioning(model, switched_prompts, steps)
  File "D:\ai\sdweb\modules\prompt_parser.py", line 129, in get_learned_conditioning
    prompt_schedules = get_learned_conditioning_prompt_schedules(prompts, steps)
  File "D:\ai\sdweb\modules\prompt_parser.py", line 102, in get_learned_conditioning_prompt_schedules
    promptdict = {prompt: get_schedule(prompt) for prompt in set(prompts)}
  File "D:\ai\sdweb\modules\prompt_parser.py", line 102, in <dictcomp>
    promptdict = {prompt: get_schedule(prompt) for prompt in set(prompts)}
  File "D:\ai\sdweb\modules\prompt_parser.py", line 100, in get_schedule
    return [[t, at_step(t, tree)] for t in collect_steps(steps, tree)]
  File "D:\ai\sdweb\modules\prompt_parser.py", line 100, in <listcomp>
    return [[t, at_step(t, tree)] for t in collect_steps(steps, tree)]
  File "D:\ai\sdweb\modules\prompt_parser.py", line 90, in at_step
    return AtStep().transform(tree)
  File "D:\ai\sdweb\venv\lib\site-packages\lark\visitors.py", line 153, in transform
    return self._transform_tree(tree)
  File "D:\ai\sdweb\venv\lib\site-packages\lark\visitors.py", line 149, in _transform_tree
    return self._call_userfunc(tree, children)
  File "D:\ai\sdweb\venv\lib\site-packages\lark\visitors.py", line 120, in _call_userfunc
    raise VisitError(tree.data, tree, e)
lark.exceptions.VisitError: Error trying to process rule "start":

generator raised StopIteration
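
For reference, the final "generator raised StopIteration" is standard Python 3.7+ behaviour (PEP 479): the empty option's sub-generator yields nothing, so the next() call inside alternate raises StopIteration, which the interpreter converts to a RuntimeError once the enclosing generator is consumed. A minimal standalone sketch of that mechanism (hypothetical names, not the webui code):

# Minimal sketch of the failure mode; empty_option and alternate are
# hypothetical stand-ins, not the actual prompt_parser code.
def empty_option():
    # stands in for the empty branch of "[red|]": it yields nothing
    return
    yield

def alternate(options, step):
    # same shape as prompt_parser.alternate's
    # "yield next(args[(step - 1) % len(args)])"
    yield next(options[(step - 1) % len(options)])

gen = alternate([empty_option()], step=1)
try:
    list(gen)
except RuntimeError as e:
    print(e)  # generator raised StopIteration

Presumably this is also why the "[red| ] apple" workaround avoids the error: the space gives that option's generator something to yield.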

Additional information

No response

@rubberbaron rubberbaron added the bug-report Report of a bug, yet to be confirmed label Feb 27, 2023
@Raivshard

Can you double-check your prompt? I had the same issue, and then noticed that there was an extra divider | after the last alternating word. As soon as I removed it, the error stopped.

@rubberbaron
Author

The point is that the | at the end is there intentionally; it's meant to select the empty string. It is a bug that you can't do that.

@rubberbaron
Author

As a trivial example of why you might want this, consider [fe|]male. Yes, you could obviously write [female|male] instead, but it illustrates the reason for wanting it.

I looked into the code, but since the grammar uses a CFG with an LALR parser it looks pretty complicated to support; you'd have to add an empty production and then adjust rules to include that as an option, and I'm not familiar enough with Lark to make those sorts of changes myself. (It would be trivial to handle in some alternative parsing engines, but it doesn't make sense to switch parsers just for this.)
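
For what it's worth, here is a toy Lark grammar (simplified and hypothetical, not the actual webui grammar) showing the kind of empty production described above, parsed with LALR; whether it can be folded into the real grammar without conflicts is a separate question:

import lark

# Toy grammar, not the webui one: "option: PLAIN?" adds the empty
# production, so "[red|]" parses with an empty second option.
grammar = r"""
start: (alternate | PLAIN)*
alternate: "[" option ("|" option)+ "]"
option: PLAIN?
PLAIN: /[^\[\]|]+/
"""

parser = lark.Lark(grammar, parser="lalr")
print(parser.parse("[red|] apple").pretty())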

@catboxanon catboxanon added bug Report of a confirmed bug and removed bug-report Report of a bug, yet to be confirmed labels Aug 4, 2023