Error occurs when using different Fooocus model in colab #3666
Unanswered
ARTURIUS94
asked this question in Q&A
Replies: 0 comments
In Google Colab, when I try to generate an image in Fooocus using a different model, generation freezes at the "Loading models ..." stage, then the runtime disconnects and shows this error:

![08-10-2024 23-07-36](https://private-user-images.githubusercontent.com/156925524/374712038-9241f4c1-39da-4005-898c-5b1b4e7889b2.jpg?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MzkzMzA2NDksIm5iZiI6MTczOTMzMDM0OSwicGF0aCI6Ii8xNTY5MjU1MjQvMzc0NzEyMDM4LTkyNDFmNGMxLTM5ZGEtNDAwNS04OThjLTViMWI0ZTc4ODliMi5qcGc_WC1BbXotQWxnb3JpdGhtPUFXUzQtSE1BQy1TSEEyNTYmWC1BbXotQ3JlZGVudGlhbD1BS0lBVkNPRFlMU0E1M1BRSzRaQSUyRjIwMjUwMjEyJTJGdXMtZWFzdC0xJTJGczMlMkZhd3M0X3JlcXVlc3QmWC1BbXotRGF0ZT0yMDI1MDIxMlQwMzE5MDlaJlgtQW16LUV4cGlyZXM9MzAwJlgtQW16LVNpZ25hdHVyZT0yNjJjZWRiNTBiMzg3YjVmNzNlOTE0MWEwY2RkOGE4YzliMWE0Y2Q1NzdlMTA1MjEyOWRlYzE1ZTQ1MmFjZGQ2JlgtQW16LVNpZ25lZEhlYWRlcnM9aG9zdCJ9.rto4B1RVj2MBGSvDyObIyGBWhuPeP3MRwmK8zQ4P7U0)

`Unexpected token '<', " <!DOCTYPE "... is not valid JSON`
The model that ships with Fooocus by default:

![08-10-2024 23-18-23](https://private-user-images.githubusercontent.com/156925524/374712265-9c08e216-1ad7-4284-be98-3469cde5d335.jpg?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MzkzMzA2NDksIm5iZiI6MTczOTMzMDM0OSwicGF0aCI6Ii8xNTY5MjU1MjQvMzc0NzEyMjY1LTljMDhlMjE2LTFhZDctNDI4NC1iZTk4LTM0NjljZGU1ZDMzNS5qcGc_WC1BbXotQWxnb3JpdGhtPUFXUzQtSE1BQy1TSEEyNTYmWC1BbXotQ3JlZGVudGlhbD1BS0lBVkNPRFlMU0E1M1BRSzRaQSUyRjIwMjUwMjEyJTJGdXMtZWFzdC0xJTJGczMlMkZhd3M0X3JlcXVlc3QmWC1BbXotRGF0ZT0yMDI1MDIxMlQwMzE5MDlaJlgtQW16LUV4cGlyZXM9MzAwJlgtQW16LVNpZ25hdHVyZT03NjhjMTY3MTdjNDc4MGY2NzA1OTM2NjQ3MWZiMWRhMGE4ODk3ODBjOGNkYTY5YWMyMWNiMmFjMWQ2ODc2OGFlJlgtQW16LVNpZ25lZEhlYWRlcnM9aG9zdCJ9.frk9MRYDtb-ude8Dlu1b__U7bLC11sfI3J5Gd0AFifg)

"juggernautXL_v8Rundiffusion.safetensors" works fine, but when I use any other model the error pops up.
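A note on a likely cause, offered as an assumption rather than a confirmed diagnosis: an "Unexpected token '<', \"<!DOCTYPE\"..." message generally means the browser received an HTML error page where it expected JSON, which often happens because the backend process died, e.g. the Colab runtime ran out of RAM while loading a large SDXL checkpoint. A minimal diagnostic sketch for checking, before switching models, whether a checkpoint file plus some headroom fits in the currently available memory; the function names `free_ram_gb`/`can_load` and the 2 GB headroom are illustrative choices, not part of Fooocus:

```python
import os

def free_ram_gb() -> float:
    """Available memory in GB, read from /proc/meminfo (Linux, incl. Colab)."""
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemAvailable:"):
                return int(line.split()[1]) / 1024 ** 2  # value is in kB
    return 0.0

def can_load(model_path: str, headroom_gb: float = 2.0) -> bool:
    """Rough check: checkpoint size plus headroom must fit in available RAM."""
    size_gb = os.path.getsize(model_path) / 1024 ** 3
    return free_ram_gb() >= size_gb + headroom_gb
```

If `can_load(...)` returns `False` for the checkpoint you are switching to, the freeze at "Loading models ..." followed by a disconnect is consistent with the runtime being killed for lack of memory, and a high-RAM runtime (or a smaller model) would be worth trying.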
How can I fix this?