Getting the error below when I attempt to run `python inference.py --input_dir examples/ --output_dir output/`:
```
/mnt/asabet/IntrinsicCompositing/intrinsic_compositing/shading/pipeline.py:108: UserWarning: Using a target size (torch.Size([169247, 352])) that is different to the input size (torch.Size([169247])). This will likely lead to incorrect results due to broadcasting. Please ensure they have the same size.
  loss = torch.nn.functional.mse_loss(pred_shd.reshape(-1), b)
Traceback (most recent call last):
  File "/mnt/asabet/IntrinsicCompositing/inference/inference.py", line 201, in <module>
    coeffs, lgt_vis = get_light_coeffs(
  File "/mnt/asabet/IntrinsicCompositing/intrinsic_compositing/shading/pipeline.py", line 138, in get_light_coeffs
    init_loss = test_init(params, A, b)
  File "/mnt/asabet/IntrinsicCompositing/intrinsic_compositing/shading/pipeline.py", line 108, in test_init
    loss = torch.nn.functional.mse_loss(pred_shd.reshape(-1), b)
  File "/mnt/asabet/IntrinsicCompositing/.venv/lib/python3.10/site-packages/torch/nn/functional.py", line 3365, in mse_loss
    expanded_input, expanded_target = torch.broadcast_tensors(input, target)
  File "/mnt/asabet/IntrinsicCompositing/.venv/lib/python3.10/site-packages/torch/functional.py", line 76, in broadcast_tensors
    return _VF.broadcast_tensors(tensors)  # type: ignore[attr-defined]
RuntimeError: The size of tensor a (169247) must match the size of tensor b (352) at non-singleton dimension 1
```
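For reference, the mismatch follows from standard broadcasting rules (sketched here with NumPy, which uses the same rules as PyTorch): the flat vector of length 169247 is right-aligned against the `(169247, 352)` target, so 169247 gets compared with 352 at the last dimension and the broadcast fails. The shapes below are copied from the error message; the variable names are stand-ins, not the pipeline's actual tensors.

```python
import numpy as np

pred = np.zeros(169247)           # stands in for pred_shd.reshape(-1)
target = np.zeros((169247, 352))  # stands in for b

# Broadcasting aligns trailing dimensions: (169247,) vs (169247, 352)
# compares 169247 against 352; neither is 1, so they are incompatible.
try:
    np.broadcast_shapes(pred.shape, target.shape)
except ValueError as e:
    print("broadcast fails:", e)
```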
If I work around this by reshaping the `pred_shd` tensor, I run into errors elsewhere, e.g.:
```
  File "/mnt/asabet/IntrinsicCompositing/intrinsic_compositing/shading/pipeline.py", line 153, in get_light_coeffs
    return coeffs, out_shd.reshape(shd.shape)
ValueError: cannot reshape array of size 174592 into shape (512,341,352)
```
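An observation, not a verified diagnosis: 512 × 341 is exactly 174592, so `out_shd` holds precisely enough elements for the first two dimensions, and it is the trailing 352 (the same stray size as in the `mse_loss` error above) that makes the reshape impossible. That suggests the extra dimension on `b` is propagating through the pipeline rather than `out_shd` being the wrong size. A quick check with NumPy:

```python
import numpy as np

out_shd = np.zeros(174592)

# 512 * 341 == 174592: out_shd exactly fills the first two target dims.
assert 512 * 341 == 174592

# The trailing 352 is what breaks the reshape...
try:
    out_shd.reshape(512, 341, 352)
except ValueError as e:
    print(e)

# ...while a 2-D reshape succeeds.
print(out_shd.reshape(512, 341).shape)  # (512, 341)
```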
The code would be easier to run if exact dependencies were pinned. For example, there are breaking changes in https://github.com/compphoto/intrinsic, which has to be manually checked out at a specific commit. It's also unclear which version of torch to use, etc.
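As an illustration of the kind of pinning that would help, a `requirements.txt` fragment along these lines would lock the intrinsic repo to an exact commit (`<known-good-sha>` is a placeholder, not the actual known-good commit, and the version numbers are examples only):

```text
# requirements.txt -- pin the intrinsic repo to an exact commit
# <known-good-sha> is a placeholder; substitute the commit that
# predates the breaking change.
intrinsic @ git+https://github.com/compphoto/intrinsic@<known-good-sha>

# Example pins for the rest of the stack; the maintainer notes
# below that recent versions should generally work.
torch==2.0.1
numpy==1.24.4
```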
Hi, sorry for the delay. I've been meaning to fix this, so thank you for bringing it up.
For now, I've pinned the version of the intrinsic decomposition pipeline to a specific commit before we updated the method. It should work if you run the pip installation command after pulling this change. As for the versions of libraries I used, it's not super strict and should work with the latest versions of pytorch, numpy, etc. I can add information in the repo about which versions are reasonable, though.