
[Bug]: h11._util.LocalProtocolError: Can't send data when our state is ERROR #10625

Open
1 task done
hellangleZ opened this issue May 22, 2023 · 18 comments
Labels
bug-report Report of a bug, yet to be confirmed

Comments


hellangleZ commented May 22, 2023

Is there an existing issue for this?

  • I have searched the existing issues and checked the recent builds/commits

What happened?

Strangely, the problem never shows up when using the web UI, but it occurs frequently when calling the API.

Exception in callback H11Protocol.timeout_keep_alive_handler()
handle: <TimerHandle when=30411.431255625 H11Protocol.timeout_keep_alive_handler()>

File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/h11/_connection.py", line 483, in send_with_data_passthrough
raise LocalProtocolError("Can't send data when our state is ERROR")

Steps to reproduce the problem

Just call the API in the normal way.

What should have happened?

It is very strange: sometimes it happens, but other times the same prompt works fine.

Commit where the problem happens

(not specified; the error occurs on ordinary API calls)

What platforms do you use to access the UI ?

Linux

What browsers do you use to access the UI ?

Microsoft Edge

Command Line Arguments

(no background:1.30),(There is only 1 Africa black boy:1.4), (front_view, full_body:1.50), cartoon ,with a yellow Lakers basketball shirt and blue shorts standing with his hands in his pockets and his hair in the air
, best quality, 8K,( extreme detail description:1.4), (sharp focus:1.4), <lora:zby-50k-000010:0.5>

List of extensions

No extension

Console logs

Traceback (most recent call last):
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/base.py", line 109, in __call__
    await response(scope, receive, send)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 662, in __aexit__
    raise exceptions[0]
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/responses.py", line 273, in wrap
    await func()
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/base.py", line 134, in stream_response
    return await super().stream_response(send)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/responses.py", line 255, in stream_response
    await send(
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 159, in _send
    await send(message)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 513, in send
    output = self.conn.send(event)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/h11/_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/h11/_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 429, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/fastapi/applications.py", line 273, in __call__
    await super().__call__(scope, receive, send)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/base.py", line 109, in __call__
    await response(scope, receive, send)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 662, in __aexit__
    raise exceptions[0]
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/responses.py", line 273, in wrap
    await func()
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/base.py", line 134, in stream_response
    return await super().stream_response(send)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/responses.py", line 255, in stream_response
    await send(
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 159, in _send
    await send(message)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 513, in send
    output = self.conn.send(event)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/h11/_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "/aml/stable-diffusion-webui/venv/lib/python3.10/site-packages/h11/_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR

Additional information

I have checked many existing issues but did not find one like mine. Could an expert please help me fix this? Thanks.

@hellangleZ hellangleZ added the bug-report Report of a bug, yet to be confirmed label May 22, 2023
@hellangleZ hellangleZ changed the title [Bug]: LocalProtocalError(Can't send data when our state is ERROR) [Bug]: h11._util.LocalProtocolError: Can't send data when our state is ERROR May 22, 2023
@hellangleZ
Author

The problem seems to come from ./stable-diffusion-webui/venv/lib/python3.10/site-packages/uvicorn/main.py.

There is a keep-alive timeout configuration there; changing it from 5 to 0 resolves the issue.

[screenshot]

@montyanderson
Contributor

@hellangleZ, I've made a pull request that implements your feedback but inside api.py #10625


shiertier commented May 22, 2023

@hellangleZ, I've made a pull request that implements your feedback but inside api.py #10625

I tried the api.py change you made, but it didn't work; the error was still reported. The full error is below.

It causes the model to be loaded multiple times, but eventually it becomes usable (or not, and the webui has to be reopened). In addition, when this problem occurs, prompt-word completion usually stops working.

Until about five hours ago I never hit this problem, but now it triggers almost every time. I haven't changed any settings or installed new extensions. The program runs in Docker. I'm wondering why this is happening.

Exception in callback H11Protocol.timeout_keep_alive_handler()
handle: <TimerHandle when=7379774.222416322 H11Protocol.timeout_keep_alive_handler()>
Traceback (most recent call last):
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_state.py", line 249, in _fire_event_triggered_transitions
    new_state = EVENT_TRIGGERED_TRANSITIONS[role][state][event_type]
KeyError: <class 'h11._events.ConnectionClosed'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/conda/envs/sd/lib/python3.10/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/conda/envs/sd/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 383, in timeout_keep_alive_handler
    self.conn.send(event)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py", line 493, in send_with_data_passthrough
    self._process_event(self.our_role, event)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py", line 242, in _process_event
    self._cstate.process_event(role, type(event), server_switch_event)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_state.py", line 238, in process_event
    self._fire_event_triggered_transitions(role, event_type)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_state.py", line 251, in _fire_event_triggered_transitions
    raise LocalProtocolError(
h11._util.LocalProtocolError: can't handle event type ConnectionClosed when role=SERVER and state=SEND_RESPONSE
API error: GET: https://525f113fd2f59ddd54.gradio.live/file=/tmp/gradio/de5d9aab9c3a9cf7b2368bd5593f771ecf038c96/Martin Schongauer.jpg {'error': 'LocalProtocolError', 'detail': '', 'body': '', 'errors': "Can't send data when our state is ERROR"}
╭───────────────────────────────────────────────────────────────────────────────── Traceback (most recent call last) ──────────────────────────────────────────────────────────────────────────────────╮
│ /conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/errors.py:162 in __call__                                                                                                           │
│                                                                                                                                                                                                      │
│ /conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/base.py:109 in __call__                                                                                                             │
│                                                                                                                                                                                                      │
│                                                                                       ... 7 frames hidden ...                                                                                        │
│                                                                                                                                                                                                      │
│ /conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py:468 in send                                                                                                                           │
│                                                                                                                                                                                                      │
│   467 │   │   """                                                                                                                                                                                    │
│ ❱ 468 │   │   data_list = self.send_with_data_passthrough(event)                                                                                                                                     │
│   469 │   │   if data_list is None:                                                                                                                                                                  │
│                                                                                                                                                                                                      │
│ ╭───────────────────────────────────────────────────────────────────────────────────────────── locals ─────────────────────────────────────────────────────────────────────────────────────────────╮ │
│ │ event = Response(status_code=200, headers=<Headers([(b'date', b'Mon, 22 May 2023 19:57:46 GMT'), (b'server', b'uvicorn'), (b'accept-ranges', b'bytes'), (b'content-type', b'image/jpeg'),        │ │
│ │         (b'last-modified', b'Sun, 21 May 2023 18:57:08 GMT'), (b'etag', b'd3de113fba97994cb51338c528b67228'), (b'content-encoding', b'gzip'), (b'vary', b'Accept-Encoding'), (b'x-process-time', │ │
│ │         b'4.0266')])>, http_version=b'1.1', reason=b'OK')                                                                                                                                        │ │
│ │  self = <h11._connection.Connection object at 0x7f3a80ed4940>                                                                                                                                    │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                                                                                                                                      │
│ /conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py:483 in send_with_data_passthrough                                                                                                     │
│                                                                                                                                                                                                      │
│   482 │   │   if self.our_state is ERROR:                                                                                                                                                            │
│ ❱ 483 │   │   │   raise LocalProtocolError("Can't send data when our state is ERROR")                                                                                                                │
│   484 │   │   try:                                                                                                                                                                                   │
│                                                                                                                                                                                                      │
│ ╭───────────────────────────────────────────────────────────────────────────────────────────── locals ─────────────────────────────────────────────────────────────────────────────────────────────╮ │
│ │ event = Response(status_code=200, headers=<Headers([(b'date', b'Mon, 22 May 2023 19:57:46 GMT'), (b'server', b'uvicorn'), (b'accept-ranges', b'bytes'), (b'content-type', b'image/jpeg'),        │ │
│ │         (b'last-modified', b'Sun, 21 May 2023 18:57:08 GMT'), (b'etag', b'd3de113fba97994cb51338c528b67228'), (b'content-encoding', b'gzip'), (b'vary', b'Accept-Encoding'), (b'x-process-time', │ │
│ │         b'4.0266')])>, http_version=b'1.1', reason=b'OK')                                                                                                                                        │ │
│ │  self = <h11._connection.Connection object at 0x7f3a80ed4940>                                                                                                                                    │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
LocalProtocolError: Can't send data when our state is ERROR
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/conda/envs/sd/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 428, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/conda/envs/sd/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/conda/envs/sd/lib/python3.10/site-packages/fastapi/applications.py", line 273, in __call__
    await super().__call__(scope, receive, send)
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/base.py", line 109, in __call__
    await response(scope, receive, send)
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 662, in __aexit__
    raise exceptions[0]
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/responses.py", line 273, in wrap
    await func()
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/base.py", line 134, in stream_response
    return await super().stream_response(send)
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/responses.py", line 255, in stream_response
    await send(
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/errors.py", line 159, in _send
    await send(message)
  File "/conda/envs/sd/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 512, in send
    output = self.conn.send(event)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR
Applying cross attention optimization (Doggettx).
Textual inversion embeddings loaded(15): bad-artist, bad-artist-anime, bad-hands-5, bad-image-v2-39000, bad_pictures, bad_prompt_version2, badhandv4, BadNegAnatomyV1-neg, By bad artist -neg, EasyNegative, easynegative, EasyNegativeV2, ng_deepnegative_v1_75t, rmadanegative4_sd15-neg, verybadimagenegative_v1.3
Textual inversion embeddings skipped(1): 21charturnerv2
Model loaded in 46.6s (load weights from disk: 22.0s, create model: 1.3s, apply weights to model: 8.7s, apply half(): 7.9s, apply dtype to VAE: 0.1s, load VAE: 1.8s, move model to device: 4.2s, load textual inversion embeddings: 0.5s).
Loading weights [None] from /stable-diffusion-webui/models/Stable-diffusion/3d/chikmix_V3.safetensors
Creating model from config: /stable-diffusion-webui/configs/v1-inference.yaml
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
Loading VAE weights from commandline argument: /stable-diffusion-webui/models/VAE/vae-ft-mse-840000-ema-pruned.ckpt
Applying cross attention optimization (Doggettx).
Model loaded in 53.0s (load weights from disk: 24.2s, create model: 1.6s, apply weights to model: 8.6s, apply half(): 9.7s, load VAE: 2.2s, move model to device: 6.6s, load textual inversion embeddings: 0.1s).
Traceback (most recent call last):
  File "/conda/envs/sd/lib/python3.10/site-packages/gradio/routes.py", line 408, in run_predict
    output = await app.get_blocks().process_api(
  File "/conda/envs/sd/lib/python3.10/site-packages/gradio/blocks.py", line 1315, in process_api
    result = await self.call_function(
  File "/conda/envs/sd/lib/python3.10/site-packages/gradio/blocks.py", line 1043, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "/stable-diffusion-webui/modules/ui_extra_networks.py", line 296, in refresh
    pg.refresh()
  File "/stable-diffusion-webui/modules/ui_extra_networks_textual_inversion.py", line 13, in refresh
    sd_hijack.model_hijack.embedding_db.load_textual_inversion_embeddings(force_reload=True)
  File "/stable-diffusion-webui/modules/textual_inversion/textual_inversion.py", line 230, in load_textual_inversion_embeddings
    self.expected_shape = self.get_expected_shape()
  File "/stable-diffusion-webui/modules/textual_inversion/textual_inversion.py", line 137, in get_expected_shape
    vec = shared.sd_model.cond_stage_model.encode_embedding_init_text(",", 1)
  File "/stable-diffusion-webui/modules/sd_hijack_clip.py", line 315, in encode_embedding_init_text
    embedded = embedding_layer.token_embedding.wrapped(ids.to(embedding_layer.token_embedding.wrapped.weight.device)).squeeze(0)
  File "/conda/envs/sd/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1265, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'Embedding' object has no attribute 'wrapped'
Loading weights [None] from /stable-diffusion-webui/models/Stable-diffusion/3d/chikmix_V3.safetensors
Creating model from config: /stable-diffusion-webui/configs/v1-inference.yaml
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
Loading VAE weights from commandline argument: /stable-diffusion-webui/models/VAE/vae-ft-mse-840000-ema-pruned.ckpt
Applying cross attention optimization (Doggettx).
Model loaded in 39.8s (load weights from disk: 15.9s, create model: 1.2s, apply weights to model: 8.9s, apply half(): 5.2s, load VAE: 0.9s, move model to device: 7.6s, load textual inversion embeddings: 0.1s).
Traceback (most recent call last):
  File "/conda/envs/sd/lib/python3.10/site-packages/gradio/routes.py", line 408, in run_predict
    output = await app.get_blocks().process_api(
  File "/conda/envs/sd/lib/python3.10/site-packages/gradio/blocks.py", line 1315, in process_api
    result = await self.call_function(
  File "/conda/envs/sd/lib/python3.10/site-packages/gradio/blocks.py", line 1043, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "/stable-diffusion-webui/modules/ui_extra_networks.py", line 296, in refresh
    pg.refresh()
  File "/stable-diffusion-webui/modules/ui_extra_networks_textual_inversion.py", line 13, in refresh
    sd_hijack.model_hijack.embedding_db.load_textual_inversion_embeddings(force_reload=True)
  File "/stable-diffusion-webui/modules/textual_inversion/textual_inversion.py", line 230, in load_textual_inversion_embeddings
    self.expected_shape = self.get_expected_shape()
  File "/stable-diffusion-webui/modules/textual_inversion/textual_inversion.py", line 137, in get_expected_shape
    vec = shared.sd_model.cond_stage_model.encode_embedding_init_text(",", 1)
  File "/stable-diffusion-webui/modules/sd_hijack_clip.py", line 315, in encode_embedding_init_text
    embedded = embedding_layer.token_embedding.wrapped(ids.to(embedding_layer.token_embedding.wrapped.weight.device)).squeeze(0)
  File "/conda/envs/sd/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1265, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'Embedding' object has no attribute 'wrapped'
Loading weights [None] from /stable-diffusion-webui/models/Stable-diffusion/3d/chikmix_V3.safetensors
Creating model from config: /stable-diffusion-webui/configs/v1-inference.yaml
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
Loading VAE weights from commandline argument: /stable-diffusion-webui/models/VAE/vae-ft-mse-840000-ema-pruned.ckpt
Applying cross attention optimization (Doggettx).
Model loaded in 48.2s (load weights from disk: 15.7s, create model: 1.3s, apply weights to model: 8.7s, apply half(): 10.8s, load VAE: 1.5s, move model to device: 10.1s, load textual inversion embeddings: 0.2s).

txt2img: girls
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 20/20 [00:08<00:00,  2.27it/s]
Total progress: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 20/20 [00:03<00:00,  5.14it/s]
Total progress: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 20/20 [00:03<00:00,  6.23it/s]


super3 commented May 23, 2023

@hellangleZ @shiertier Tried this fix as well, and it didn't solve the issue.

@AUTOMATIC1111 Any ideas here? We are really stuck.

@wzgrx
Contributor

wzgrx commented May 24, 2023

This problem still exists

LocalProtocolError: Can't send data when our state is ERROR

ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "D:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 428, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "D:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\fastapi\applications.py", line 273, in __call__
    await super().__call__(scope, receive, send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
    raise exc
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 109, in __call__
    await response(scope, receive, send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "D:\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 662, in __aexit__
    raise exceptions[0]
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 273, in wrap
    await func()
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 134, in stream_response
    return await super().stream_response(send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 255, in stream_response
    await send(
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 159, in _send
    await send(message)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 512, in send
    output = self.conn.send(event)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR

@hellangleZ
Author

@hellangleZ, I've made a pull request that implements your feedback but inside api.py #10625

I tried the api.py change you made, but it didn't work; the error was still reported. The full error is below.

It causes the model to be loaded multiple times, but eventually it becomes usable (or not, and the webui has to be reopened). In addition, when this problem occurs, prompt-word completion usually stops working. Until about five hours ago I never hit this problem, but now it triggers almost every time. I haven't changed any settings or installed new extensions. The program runs in Docker. I'm wondering why this is happening.

Exception in callback H11Protocol.timeout_keep_alive_handler()
handle: <TimerHandle when=7379774.222416322 H11Protocol.timeout_keep_alive_handler()>
Traceback (most recent call last):
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_state.py", line 249, in _fire_event_triggered_transitions
    new_state = EVENT_TRIGGERED_TRANSITIONS[role][state][event_type]
KeyError: <class 'h11._events.ConnectionClosed'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/conda/envs/sd/lib/python3.10/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/conda/envs/sd/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 383, in timeout_keep_alive_handler
    self.conn.send(event)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py", line 493, in send_with_data_passthrough
    self._process_event(self.our_role, event)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py", line 242, in _process_event
    self._cstate.process_event(role, type(event), server_switch_event)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_state.py", line 238, in process_event
    self._fire_event_triggered_transitions(role, event_type)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_state.py", line 251, in _fire_event_triggered_transitions
    raise LocalProtocolError(
h11._util.LocalProtocolError: can't handle event type ConnectionClosed when role=SERVER and state=SEND_RESPONSE
API error: GET: https://525f113fd2f59ddd54.gradio.live/file=/tmp/gradio/de5d9aab9c3a9cf7b2368bd5593f771ecf038c96/Martin Schongauer.jpg {'error': 'LocalProtocolError', 'detail': '', 'body': '', 'errors': "Can't send data when our state is ERROR"}
╭───────────────────────────────────────────────────────────────────────────────── Traceback (most recent call last) ──────────────────────────────────────────────────────────────────────────────────╮
│ /conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/errors.py:162 in __call__                                                                                                           │
│                                                                                                                                                                                                      │
│ /conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/base.py:109 in __call__                                                                                                             │
│                                                                                                                                                                                                      │
│                                                                                       ... 7 frames hidden ...                                                                                        │
│                                                                                                                                                                                                      │
│ /conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py:468 in send                                                                                                                           │
│                                                                                                                                                                                                      │
│   467 │   │   """                                                                                                                                                                                    │
│ ❱ 468 │   │   data_list = self.send_with_data_passthrough(event)                                                                                                                                     │
│   469 │   │   if data_list is None:                                                                                                                                                                  │
│                                                                                                                                                                                                      │
│ ╭───────────────────────────────────────────────────────────────────────────────────────────── locals ─────────────────────────────────────────────────────────────────────────────────────────────╮ │
│ │ event = Response(status_code=200, headers=<Headers([(b'date', b'Mon, 22 May 2023 19:57:46 GMT'), (b'server', b'uvicorn'), (b'accept-ranges', b'bytes'), (b'content-type', b'image/jpeg'),        │ │
│ │         (b'last-modified', b'Sun, 21 May 2023 18:57:08 GMT'), (b'etag', b'd3de113fba97994cb51338c528b67228'), (b'content-encoding', b'gzip'), (b'vary', b'Accept-Encoding'), (b'x-process-time', │ │
│ │         b'4.0266')])>, http_version=b'1.1', reason=b'OK')                                                                                                                                        │ │
│ │  self = <h11._connection.Connection object at 0x7f3a80ed4940>                                                                                                                                    │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                                                                                                                                      │
│ /conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py:483 in send_with_data_passthrough                                                                                                     │
│                                                                                                                                                                                                      │
│   482 │   │   if self.our_state is ERROR:                                                                                                                                                            │
│ ❱ 483 │   │   │   raise LocalProtocolError("Can't send data when our state is ERROR")                                                                                                                │
│   484 │   │   try:                                                                                                                                                                                   │
│                                                                                                                                                                                                      │
│ ╭───────────────────────────────────────────────────────────────────────────────────────────── locals ─────────────────────────────────────────────────────────────────────────────────────────────╮ │
│ │ event = Response(status_code=200, headers=<Headers([(b'date', b'Mon, 22 May 2023 19:57:46 GMT'), (b'server', b'uvicorn'), (b'accept-ranges', b'bytes'), (b'content-type', b'image/jpeg'),        │ │
│ │         (b'last-modified', b'Sun, 21 May 2023 18:57:08 GMT'), (b'etag', b'd3de113fba97994cb51338c528b67228'), (b'content-encoding', b'gzip'), (b'vary', b'Accept-Encoding'), (b'x-process-time', │ │
│ │         b'4.0266')])>, http_version=b'1.1', reason=b'OK')                                                                                                                                        │ │
│ │  self = <h11._connection.Connection object at 0x7f3a80ed4940>                                                                                                                                    │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
LocalProtocolError: Can't send data when our state is ERROR
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/conda/envs/sd/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 428, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/conda/envs/sd/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/conda/envs/sd/lib/python3.10/site-packages/fastapi/applications.py", line 273, in __call__
    await super().__call__(scope, receive, send)
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/base.py", line 109, in __call__
    await response(scope, receive, send)
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 662, in __aexit__
    raise exceptions[0]
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/responses.py", line 273, in wrap
    await func()
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/base.py", line 134, in stream_response
    return await super().stream_response(send)
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/responses.py", line 255, in stream_response
    await send(
  File "/conda/envs/sd/lib/python3.10/site-packages/starlette/middleware/errors.py", line 159, in _send
    await send(message)
  File "/conda/envs/sd/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 512, in send
    output = self.conn.send(event)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "/conda/envs/sd/lib/python3.10/site-packages/h11/_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR
Applying cross attention optimization (Doggettx).
Textual inversion embeddings loaded(15): bad-artist, bad-artist-anime, bad-hands-5, bad-image-v2-39000, bad_pictures, bad_prompt_version2, badhandv4, BadNegAnatomyV1-neg, By bad artist -neg, EasyNegative, easynegative, EasyNegativeV2, ng_deepnegative_v1_75t, rmadanegative4_sd15-neg, verybadimagenegative_v1.3
Textual inversion embeddings skipped(1): 21charturnerv2
Model loaded in 46.6s (load weights from disk: 22.0s, create model: 1.3s, apply weights to model: 8.7s, apply half(): 7.9s, apply dtype to VAE: 0.1s, load VAE: 1.8s, move model to device: 4.2s, load textual inversion embeddings: 0.5s).
Loading weights [None] from /stable-diffusion-webui/models/Stable-diffusion/3d/chikmix_V3.safetensors
Creating model from config: /stable-diffusion-webui/configs/v1-inference.yaml
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
Loading VAE weights from commandline argument: /stable-diffusion-webui/models/VAE/vae-ft-mse-840000-ema-pruned.ckpt
Applying cross attention optimization (Doggettx).
Model loaded in 53.0s (load weights from disk: 24.2s, create model: 1.6s, apply weights to model: 8.6s, apply half(): 9.7s, load VAE: 2.2s, move model to device: 6.6s, load textual inversion embeddings: 0.1s).
Traceback (most recent call last):
  File "/conda/envs/sd/lib/python3.10/site-packages/gradio/routes.py", line 408, in run_predict
    output = await app.get_blocks().process_api(
  File "/conda/envs/sd/lib/python3.10/site-packages/gradio/blocks.py", line 1315, in process_api
    result = await self.call_function(
  File "/conda/envs/sd/lib/python3.10/site-packages/gradio/blocks.py", line 1043, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "/stable-diffusion-webui/modules/ui_extra_networks.py", line 296, in refresh
    pg.refresh()
  File "/stable-diffusion-webui/modules/ui_extra_networks_textual_inversion.py", line 13, in refresh
    sd_hijack.model_hijack.embedding_db.load_textual_inversion_embeddings(force_reload=True)
  File "/stable-diffusion-webui/modules/textual_inversion/textual_inversion.py", line 230, in load_textual_inversion_embeddings
    self.expected_shape = self.get_expected_shape()
  File "/stable-diffusion-webui/modules/textual_inversion/textual_inversion.py", line 137, in get_expected_shape
    vec = shared.sd_model.cond_stage_model.encode_embedding_init_text(",", 1)
  File "/stable-diffusion-webui/modules/sd_hijack_clip.py", line 315, in encode_embedding_init_text
    embedded = embedding_layer.token_embedding.wrapped(ids.to(embedding_layer.token_embedding.wrapped.weight.device)).squeeze(0)
  File "/conda/envs/sd/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1265, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'Embedding' object has no attribute 'wrapped'
Loading weights [None] from /stable-diffusion-webui/models/Stable-diffusion/3d/chikmix_V3.safetensors
Creating model from config: /stable-diffusion-webui/configs/v1-inference.yaml
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
Loading VAE weights from commandline argument: /stable-diffusion-webui/models/VAE/vae-ft-mse-840000-ema-pruned.ckpt
Applying cross attention optimization (Doggettx).
Model loaded in 39.8s (load weights from disk: 15.9s, create model: 1.2s, apply weights to model: 8.9s, apply half(): 5.2s, load VAE: 0.9s, move model to device: 7.6s, load textual inversion embeddings: 0.1s).
Traceback (most recent call last):
  File "/conda/envs/sd/lib/python3.10/site-packages/gradio/routes.py", line 408, in run_predict
    output = await app.get_blocks().process_api(
  File "/conda/envs/sd/lib/python3.10/site-packages/gradio/blocks.py", line 1315, in process_api
    result = await self.call_function(
  File "/conda/envs/sd/lib/python3.10/site-packages/gradio/blocks.py", line 1043, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/conda/envs/sd/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "/stable-diffusion-webui/modules/ui_extra_networks.py", line 296, in refresh
    pg.refresh()
  File "/stable-diffusion-webui/modules/ui_extra_networks_textual_inversion.py", line 13, in refresh
    sd_hijack.model_hijack.embedding_db.load_textual_inversion_embeddings(force_reload=True)
  File "/stable-diffusion-webui/modules/textual_inversion/textual_inversion.py", line 230, in load_textual_inversion_embeddings
    self.expected_shape = self.get_expected_shape()
  File "/stable-diffusion-webui/modules/textual_inversion/textual_inversion.py", line 137, in get_expected_shape
    vec = shared.sd_model.cond_stage_model.encode_embedding_init_text(",", 1)
  File "/stable-diffusion-webui/modules/sd_hijack_clip.py", line 315, in encode_embedding_init_text
    embedded = embedding_layer.token_embedding.wrapped(ids.to(embedding_layer.token_embedding.wrapped.weight.device)).squeeze(0)
  File "/conda/envs/sd/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1265, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'Embedding' object has no attribute 'wrapped'
Loading weights [None] from /stable-diffusion-webui/models/Stable-diffusion/3d/chikmix_V3.safetensors
Creating model from config: /stable-diffusion-webui/configs/v1-inference.yaml
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
Loading VAE weights from commandline argument: /stable-diffusion-webui/models/VAE/vae-ft-mse-840000-ema-pruned.ckpt
Applying cross attention optimization (Doggettx).
Model loaded in 48.2s (load weights from disk: 15.7s, create model: 1.3s, apply weights to model: 8.7s, apply half(): 10.8s, load VAE: 1.5s, move model to device: 10.1s, load textual inversion embeddings: 0.2s).

txt2img: girls
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 20/20 [00:08<00:00,  2.27it/s]
Total progress: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 20/20 [00:03<00:00,  5.14it/s]
Total progress: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 20/20 [00:03<00:00,  6.23it/s]

Your log is not related to the same issue; it looks like your API call used a wrong prompt...

@chriswhite5918
Copy link

Same problem here.

@cooperdk
Copy link

I get this if I run in API mode. Disabling the API while doing what you need to (e.g. updating extensions) will fix it.

But I had it working so I could update without disabling the API, so I have no idea what went wrong here.

The developer needs to fix the update functions, because they should work regardless of your configuration.

@zhugelaing888
Copy link

pip install fastapi[all] --force-reinstall

@super3
Copy link

super3 commented Jun 27, 2023

@AUTOMATIC1111 We (@montyanderson and I) authored this commit based on a user's suggestion. It does not fix the listed issue.

Upgrading to any Automatic1111 version past January introduces this error, which occurs randomly about 3% of the time. Per a discussion with @vladmandic, the Gunicorn timeout settings don't get properly passed through via this method, so that commit does nothing. Any ideas?

@super3
Copy link

super3 commented Jun 27, 2023

@AUTOMATIC1111 Can you reopen this issue?

@Xynonners
Copy link

Still happening on my end as well.

@tangjicheng1
Copy link
Contributor

Still happening on my end as well.

I think you should add --nowebui when launching.

@CamiloMM
Copy link

CamiloMM commented Sep 17, 2023

If someone is having this bug:

  1. disable all extensions
  2. enable them back

I did this between restarts, thinking a single plugin was the culprit; I don't know if the restarts are necessary. I re-enabled all the plugins I had enabled before (taking a screenshot to remember which ones), and it turns out just disabling/re-enabling fixed it.

@thulle
Copy link

thulle commented Sep 22, 2023

I think you should add --nowebui when launching.

I'm using A1111 with a discord bot. I don't have any plugins installed at all, and I could easily trigger this by queuing api-requests with different checkpoints.
Starting with the commandline arguments:

--listen --api --nowebui --port 7860 --timeout-keep-alive 60

seems to have solved it; more specifically, I think it was the --nowebui flag that @tangjicheng46 mentioned that solved it for me. I queued 4 requests with different checkpoints without issue.

@Hulkninja
Copy link

Having this issue as well when using this software to generate pictures to send to another site.
When it occurs, the process sometimes freezes until I select the cmd window and press a key on my keyboard.

I've had a single run generate 100+ images without issue, and I've had runs where the very first image triggered it, interrupting the process and forcing me to start all over again.

Gonna try some of the fixes mentioned here but I'm not hoping for too much...
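Since the error strikes intermittently, client scripts in this situation often wrap the API call in a retry loop rather than restarting the whole run. A minimal sketch (not part of the webui; `fn` is a placeholder for whatever function performs the HTTP request, e.g. a `requests.post` to `/sdapi/v1/txt2img`):

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0):
    """Call fn(); on any exception, back off exponentially and retry.

    fn stands in for the function that makes the API request.
    Re-raises the last exception if every attempt fails.
    """
    last_exc = None
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as exc:
            last_exc = exc
            time.sleep(base_delay * (2 ** attempt))
    raise last_exc
```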

@super3
Copy link

super3 commented Jan 17, 2024 via email

@GitHub1712
Copy link

GitHub1712 commented Feb 8, 2024

I had the "Can't send data when our state is ERROR" error on every version for a long time when doing intense calculations in extensions, even ones that don't use the API. It seems the API doesn't tolerate those calculations, because they block it from answering requests.
I solved it by disabling the API and implementing my own alternative.
I'm sure I made some mistakes, but it runs in a separate thread, could run as multiple instances on multiple ports, with or without SSL, and everything works well, even from JS; the problem is solved and the whole webui seems to run smoother. It also has a "turbo" GET request: just type address?prompt in the browser to get a turbo image.
I only implemented the basic functions I need, but the original API functions could be added easily if someone likes the architecture.
I placed it in extensions/miniapi/miniapi.py. Port and SSL are hard-coded at the top; the bindings and API code are below the functions.

import threading
from flask import Flask, request, jsonify, Blueprint
import os
from PIL import Image
import io
import base64
from modules import shared, scripts, sd_samplers
from modules.call_queue import queue_lock
from modules.processing import StableDiffusionProcessingImg2Img, StableDiffusionProcessingTxt2Img, process_images
from modules.shared import opts
from contextlib import closing
from modules.progress import create_task_id, add_task_to_queue, start_task, finish_task
import gradio as gr
from fastapi.exceptions import HTTPException
import requests
from io import BytesIO
import ipaddress
import sys
from modules.sd_models_config import find_checkpoint_config_near_filename


# TLS certificate and key files, used when SSL is enabled below;
# Flask's app.run() accepts the file paths directly via ssl_context.
cer = 'webui.cert'
key = 'webui.key'
bp = Blueprint('main', __name__)

port = 5000
host = "0.0.0.0"
SSL = True  # serve over HTTPS using the cert/key above


def script_name_to_index(name, scripts):
    try:
        return [script.title().lower() for script in scripts].index(name.lower())
    except Exception as e:
        pass

def get_selectable_script( script_name, script_runner):
    if script_name is None or script_name == "":
        return None, None
    script_idx = script_name_to_index(script_name, script_runner.selectable_scripts)
    script = script_runner.selectable_scripts[script_idx]
    return script, script_idx


def get_script_info():
    res = []
    for script_list in [scripts.scripts_txt2img.scripts, scripts.scripts_img2img.scripts]:
        res += [script.api_info for script in script_list if script.api_info is not None]
    return res

def get_script( script_name, script_runner):
    if script_name is None or script_name == "":
        return None, None
    script_idx = script_name_to_index(script_name, script_runner.scripts)
    return script_runner.scripts[script_idx]

def init_default_script_args( script_runner):
    last_arg_index = 1
    for script in script_runner.scripts:
        if last_arg_index < script.args_to:
            last_arg_index = script.args_to
    script_args = [None]*last_arg_index
    script_args[0] = 0

    with gr.Blocks(): # will throw errors calling ui function without this
        for script in script_runner.scripts:
            if script.ui(script.is_img2img):
                ui_default_values = []
                for elem in script.ui(script.is_img2img):
                    ui_default_values.append(elem.value)
                script_args[script.args_from:script.args_to] = ui_default_values
    return script_args

def init_script_args( request, default_script_args, selectable_scripts, selectable_idx, script_runner, *, input_script_args=None):
    script_args = default_script_args.copy()
    if selectable_scripts:
        script_args[selectable_scripts.args_from:selectable_scripts.args_to] = request.script_args
        script_args[0] = selectable_idx + 1
    return script_args


def encode_pil_to_base64(image):
    image.convert('RGB')
    data = io.BytesIO()
    image.save(data, "PNG")
    return base64.b64encode(data.getvalue()).decode('utf-8')


class AttrDict(dict):
    """Dict with attribute access. Missing keys read as None, so optional
    request fields (e.g. 'mask', 'script_name') don't raise AttributeError."""

    def __init__(self, *args, **kwargs):
        super(AttrDict, self).__init__(*args, **kwargs)
        self.__dict__ = self

    def __getattr__(self, name):
        # only invoked when normal attribute lookup fails
        return None

    def copy(self, update=None):
        new_dict = AttrDict(self)
        if update:
            new_dict.update(update)
        return new_dict


def validate_sampler_name(name):
    config = sd_samplers.all_samplers_map.get(name, None)
    if config is None:
        pass
        # raise HTTPException(status_code=404, detail="Sampler not found")

    return name

def verify_url(url):
    """Returns True if the url refers to a global resource."""
    import socket
    from urllib.parse import urlparse
    try:
        parsed_url = urlparse(url)
        domain_name = parsed_url.netloc
        host = socket.gethostbyname_ex(domain_name)
        for ip in host[2]:
            ip_addr = ipaddress.ip_address(ip)
            if not ip_addr.is_global:
                return False
    except Exception:
        return False
    return True


def get_sd_models():
    import modules.sd_models as sd_models
    return [{"title": x.title, "model_name": x.model_name, "hash": x.shorthash, "sha256": x.sha256, "filename": x.filename, "config": find_checkpoint_config_near_filename(x)} for x in sd_models.checkpoints_list.values()]


def get_checkpoints():
    return get_sd_models()


def refresh_checkpoints():
    shared.refresh_checkpoints()
    return '', 200, {
    'Access-Control-Allow-Origin': '*',
    'Access-Control-Allow-Methods': 'POST, OPTIONS',
    'Access-Control-Allow-Headers': 'Content-Type',
    }


from modules import sd_models
def reload_checkpoint():
    sd_models.send_model_to_device(shared.sd_model)
    return '', 200, {
    'Access-Control-Allow-Origin': '*',
    'Access-Control-Allow-Methods': 'POST, OPTIONS',
    'Access-Control-Allow-Headers': 'Content-Type',
    }

def set_checkpoint():
    req = AttrDict( request.json)
    checkpoint_name = req.get("sd_model_checkpoint", None)
    jsname = req.get("jsname", None)
    if checkpoint_name is not None and checkpoint_name not in sd_models.checkpoint_aliases:
        raise RuntimeError(f"model {checkpoint_name!r} not found")
    shared.opts.set("sd_model_checkpoint", checkpoint_name, is_api=True)
    shared.opts.data[jsname + '_sd_model_checkpoint'] = checkpoint_name
    shared.opts.data['extend' + '_sd_model_checkpoint'] = checkpoint_name
    shared.opts.save(shared.config_filename)
    return checkpoint_name



def decode_base64_to_image(encoding):
    if encoding.startswith("http://") or encoding.startswith("https://"):
        if not opts.api_enable_requests:
            raise HTTPException(status_code=500, detail="Requests not allowed")
        if opts.api_forbid_local_requests and not verify_url(encoding):
            raise HTTPException(status_code=500, detail="Request to local resource not allowed")
        headers = {'user-agent': opts.api_useragent} if opts.api_useragent else {}
        response = requests.get(encoding, timeout=30, headers=headers)
        try:
            image = Image.open(BytesIO(response.content))
            return image
        except Exception as e:
            raise HTTPException(status_code=500, detail="Invalid image url") from e
    if encoding.startswith("data:image/"):
        encoding = encoding.split(";")[1].split(",")[1]
    try:
        image = Image.open(BytesIO(base64.b64decode(encoding)))
        return image
    except Exception as e:
        raise HTTPException(status_code=500, detail="Invalid encoded image") from e

from modules.api import models
from flask import send_file

def txt2imgapi(txt2imgreq=None):
    # As a Flask view this is called with no arguments, so fall back to the
    # request body; txt2imgturbo() calls it directly with a dict.
    if txt2imgreq is None:
        txt2imgreq = request.json
    txt2imgreq = AttrDict(txt2imgreq)
    task_id = create_task_id("txt2img")
    script_runner = scripts.scripts_txt2img
    populate = txt2imgreq.copy(update={
        "sampler_name": validate_sampler_name(txt2imgreq.sampler_name or txt2imgreq.sampler_index),
        "do_not_save_samples": not txt2imgreq.save_images,
        "do_not_save_grid": not txt2imgreq.save_images,
    })
    if populate.sampler_name:
        populate.sampler_index = None  # prevent a warning later on
    args = vars(populate)
    args.pop('script_name', None)
    args.pop('script_args', None) # will refeed them to the pipeline directly after initializing them
    args.pop('alwayson_scripts', None)
    args.pop('infotext', None)
    script_args = init_script_args(txt2imgreq, init_default_script_args(script_runner), None, None, script_runner, input_script_args=None)
    args.pop('save_images', None)
    add_task_to_queue(task_id)
    with queue_lock:
        with closing(StableDiffusionProcessingTxt2Img(sd_model=shared.sd_model, **args)) as p:
            p.is_api = True
            p.scripts = script_runner
            p.outpath_grids = opts.outdir_txt2img_grids
            p.outpath_samples = opts.outdir_txt2img_samples

            try:
                shared.state.begin(job="scripts_txt2img")
                start_task(task_id)
                p.script_args = tuple(script_args) # Need to pass args as tuple here
                processed = process_images(p)
                finish_task(task_id)
            finally:
                shared.state.end()
                shared.total_tqdm.clear()
    img = processed.images[0]
    byte_arr = io.BytesIO()
    img.save(byte_arr, format='PNG')
    byte_arr.seek(0)
    return send_file(byte_arr, mimetype='image/png')


def txt2imgturbo():
    # The prompt is passed as the first query-string key: GET /sdapi/v1/txt2imgturbo?<prompt>
    if not request.args:
        return 'usage: ?<prompt>', 400
    first_arg = list(request.args.items())[0]
    data = {
        "enable_hr": False,
        "denoising_strength": 1.0,
        "firstphase_width": 0,
        "firstphase_height": 0,
        "hr_scale": 2,
        "prompt": str(first_arg[0]),
        "negative_prompt": "",
        "seed": -1,
        "subseed": -1,
        "subseed_strength": 0,
        "seed_resize_from_h": -1,
        "seed_resize_from_w": -1,
        "batch_size": 1,
        "n_iter": 1,
        "steps": 1,
        "cfg_scale":  1.0,
        "width": 512,
        "height": 512,
        "restore_faces": False,
        "tiling": False,
        "do_not_save_samples": False,
        "do_not_save_grid": False,
        "eta": 0,
        "s_min_uncond": 0,
        "s_churn": 0,
        "s_tmax": 0,
        "s_tmin": 0,
        "s_noise": 1,
        "override_settings": {},
        "override_settings_restore_afterwards": True,
        "script_args": [],
        "sampler_name": "Euler a",
        "sampler_index": "Euler a",
        "save_images": False,
        "alwayson_scripts": {}
    }

    checkpoint_name="basic\\x-Special\\sd_xl_turbo_1.0_fp16.safetensors"
    shared.opts.set("sd_model_checkpoint", checkpoint_name, is_api=True)
    shared.opts.data['turbo' + '_sd_model_checkpoint'] = checkpoint_name
    shared.opts.save(shared.config_filename)
    return txt2imgapi(data)


def img2imgapi( ):
    img2imgreq = AttrDict( request.json)
    task_id = create_task_id("img2img")
    init_images = img2imgreq.init_images
    mask = img2imgreq.mask
    if mask:
        mask = decode_base64_to_image(mask)
    img2img_script_runner = scripts.scripts_img2img
    selectable_scripts, selectable_script_idx = None,None#get_selectable_script(img2imgreq.script_name, script_runner)
    populate = img2imgreq.copy(update={  # Override __init__ params
        "sampler_name": validate_sampler_name(img2imgreq.sampler_name or img2imgreq.sampler_index),
        "do_not_save_samples": True,#not img2imgreq.save_images,
        "do_not_save_grid": True,#not img2imgreq.save_images,
        "mask": mask,
    })
    if populate.sampler_name:
        populate.sampler_index = None  # prevent a warning later on
    args = vars(populate)#vars(dict(populate))
    args.pop('include_init_images', None)  # this is meant to be done by "exclude": True in the model, but for a reason I can't determine it doesn't work there
    args.pop('script_name', None)
    args.pop('script_args', None)  # will refeed them to the pipeline directly after initializing them
    args.pop('alwayson_scripts', None)
    args.pop('infotext', None)
    script_args = init_script_args(img2imgreq, init_default_script_args(img2img_script_runner), selectable_scripts, selectable_script_idx, img2img_script_runner, input_script_args='')
    args.pop('save_images', None)
    add_task_to_queue(task_id)

    with queue_lock:
        with closing(StableDiffusionProcessingImg2Img(sd_model=shared.sd_model, **args)) as p:
            p.init_images = [decode_base64_to_image(x) for x in init_images]
            p.is_api = True
            p.scripts = img2img_script_runner
            p.outpath_grids = opts.outdir_img2img_grids
            p.outpath_samples = opts.outdir_img2img_samples
            try:
                shared.state.begin(job="scripts_img2img")
                start_task(task_id)
                if selectable_scripts is not None:
                    p.script_args = script_args
                    processed = scripts.scripts_img2img.run(p, *p.script_args) # Need to pass args as list here
                else:
                    p.script_args = tuple(script_args) # Need to pass args as tuple here
                    processed = process_images(p)
                finish_task(task_id)
            finally:
                shared.state.end()
                shared.total_tqdm.clear()
    if not img2imgreq.include_init_images:
        img2imgreq['init_images'] = None
        img2imgreq['mask'] = None
    return encode_pil_to_base64(processed.images[0])
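For context, here is a minimal client-side sketch of how this endpoint is typically called. The field names match the request object read above; the host/port, sampler name, and image bytes are placeholders, not values from my setup:

```python
import base64
import json
import urllib.request

# Placeholder: in practice this is the base64 of a real PNG/JPEG file.
init_image_b64 = base64.b64encode(b"\x89PNG\r\n\x1a\n...").decode("ascii")

payload = {
    "init_images": [init_image_b64],
    "mask": None,
    "sampler_name": "Euler a",  # assumed sampler name
    "denoising_strength": 0.6,
}

def send(payload, url="http://127.0.0.1:7860/sdapi/v1/img2img"):
    # POST the JSON body; the handler above returns a base64-encoded image string.
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

When the server drops the connection mid-response (as in the h11 error above), `urlopen` on the client side typically raises rather than returning partial data.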


def options():
    return '', 200, {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': 'POST, OPTIONS',
        'Access-Control-Allow-Headers': 'Content-Type',
    }
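The tuple returned above is Flask's `(body, status, headers)` form; a browser preflight succeeds when the `OPTIONS` response carries these headers. A self-contained restatement of the handler (no Flask required) showing the checks a browser effectively performs:

```python
def options():
    # Same (body, status, headers) tuple that Flask unpacks into a Response.
    return '', 200, {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': 'POST, OPTIONS',
        'Access-Control-Allow-Headers': 'Content-Type',
    }

body, status, headers = options()
# A cross-origin POST with a JSON body is allowed when all three hold:
assert status == 200
assert headers['Access-Control-Allow-Origin'] == '*'
assert 'POST' in headers['Access-Control-Allow-Methods']
```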

bp.route('/sdapi/v1/img2img', methods=['OPTIONS'])(options)

bp.route('/sdapi/v1/txt2imgturbo', methods=['GET'])(txt2imgturbo)
bp.route('/sdapi/v1/txt2img', methods=['POST'])(txt2imgapi)
bp.route('/sdapi/v1/img2img', methods=['POST'])(img2imgapi)
bp.route('/sdapi/v1/options', methods=['POST'])(set_checkpoint)
bp.route('/sdapi/v1/reload-checkpoint', methods=['POST'])(reload_checkpoint)
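Registering handlers by calling `bp.route(...)` as a plain function, rather than stacking `@bp.route` above each `def`, is equivalent because `route` returns a decorator. A minimal registry (hypothetical names, not Flask internals) illustrating the mechanics:

```python
routes = {}

def route(path, methods=("GET",)):
    # Returns a decorator that records the handler, like Flask's bp.route.
    def register(fn):
        for m in methods:
            routes[(m, path)] = fn
        return fn
    return register

def img2imgapi():
    return "image-bytes"

# Decorator-call form used above:
route('/sdapi/v1/img2img', methods=['POST'])(img2imgapi)

assert routes[('POST', '/sdapi/v1/img2img')] is img2imgapi
```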

from flask_cors import CORS

def create_app():
    app = Flask(__name__)
    # CORS(app)  # Add this line
    CORS(app, resources={r"/*": {"origins": "*"}})  # Allow any origin
    app.register_blueprint(bp)
    return app

os.environ['ENV'] = 'production'
app = create_app()
sys.stderr = open(os.devnull, 'w')  # caution: discards all error output, including tracebacks
# sys.stdout = open(os.devnull, 'w')
def run_flask():
    if SSL:
        app.run(host=host, port=port, ssl_context=(cer, key), debug=False)
    else:
        app.run(host=host, port=port, debug=False)

print(f"Starting Flask API server on port: {port}")
flask_thread = threading.Thread(target=run_flask)
flask_thread.start()
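Note that the thread above is non-daemon, so the process stays alive as long as Flask runs; marking it `daemon=True` would let the main program exit without it. The same background-server pattern with only the standard library (stand-in handler, not the Flask app):

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Ping(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Ping)  # port 0 = pick any free port
t = threading.Thread(target=server.serve_forever, daemon=True)
t.start()

url = f"http://127.0.0.1:{server.server_port}/"
with urllib.request.urlopen(url) as resp:
    body = resp.read()

server.shutdown()
```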






