Validating Skin weights encoded with normalized UBYTE are reported as errors. #132

Closed
vpenades opened this issue Jan 24, 2020 · 4 comments

@vpenades

I am writing some glTF files with skin joint weight vertex attributes. To save memory, I am writing the WEIGHTS_0 attribute using normalized UBYTE.

All the original weights are checked to ensure their sum equals 1.

But when encoded to normalized UBYTE, the glTF validator reports these errors:

Errors:
		/meshes/0/primitives/1/attributes/WEIGHTS_0: Weights accessor elements (at indices 0..3) have non-normalized sum: 0.9960784316062927.
		/meshes/0/primitives/1/attributes/WEIGHTS_0: Weights accessor elements (at indices 4..7) have non-normalized sum: 0.9960784316062927.
		/meshes/0/primitives/0/attributes/WEIGHTS_0: Weights accessor elements (at indices 8..11) have non-normalized sum: 0.9960784316062927.
		/meshes/0/primitives/0/attributes/WEIGHTS_0: Weights accessor elements (at indices 12..15) have non-normalized sum: 0.9960784316062927.
		/meshes/0/primitives/1/attributes/WEIGHTS_0: Weights accessor elements (at indices 16..19) have non-normalized sum: 0.9960784316062927.
		/meshes/0/primitives/0/attributes/WEIGHTS_0: Weights accessor elements (at indices 20..23) have non-normalized sum: 0.9960784316062927.
		/meshes/0/primitives/1/attributes/WEIGHTS_0: Weights accessor elements (at indices 24..27) have non-normalized sum: 0.9960784316062927.
		/meshes/0/primitives/0/attributes/WEIGHTS_0: Weights accessor elements (at indices 28..31) have non-normalized sum: 0.9960784316062927.

All the values are around 0.996..., so I presume this is happening due to unavoidable precision loss.

I would suggest that the glTF validator adjust the allowed error based on the precision of the encoding.
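A minimal sketch of the situation (my own reconstruction, not the validator's code): quantizing the weights to normalized UBYTE with a truncating quantizer reproduces the exact sum from the error messages, and that sum is far outside the validator's threshold.

```python
# Sketch: quantize skin weights to normalized UBYTE (unorm8) and check the
# decoded sum against the validator's threshold. Illustration only.
import math

weights = [0.5, 0.5, 0.0, 0.0]

# A truncating quantizer turns each 0.5 into floor(0.5 * 255) = 127.
quantized = [math.floor(w * 255) for w in weights]
print(quantized)                            # [127, 127, 0, 0]

# Decoding divides by 255, so the sum falls 1/255 short of 1.
decoded_sum = sum(quantized) / 255
print(decoded_sum)                          # ~0.9960784..., as reported

# The validator's threshold (2e-7 * weightCount) is far tighter than 1/255.
threshold = 2e-7 * len(weights)
print(abs(1.0 - decoded_sum) > threshold)   # True: reported as an error
```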

@lexaknyazev
Member

The threshold for the weights sum is defined as 2e-7 * weightCount. This value seems to be enough to cover unorm8 values, based on the output of the meshoptimizer quantization tool.

Could you please share an asset both with original and quantized weights?

@vpenades
Author

The original weights are:
0.5, 0.5, 0, 0

Quantized:
127, 127, 0, 0

The quantized sum would be:
(127 + 127 + 0 + 0) / 255 = 0.996078431

This clearly has a much larger error than 2e-7.

I understand the weights could be encoded as 127 and 128, which would make the total sum exactly 1, but I don't know which is more correct: to have both weights even but undershoot, or to have one weight undershoot and the other one overshoot.
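The two options can be compared directly. Assuming unorm8 decoding (divide by 255), only the 127/128 split sums to exactly 1:

```python
# Sketch comparing the two quantizations discussed above (unorm8 decoding).
even_undershoot = (127 + 127 + 0 + 0) / 255   # both weights equal, sum falls short
mixed_split     = (127 + 128 + 0 + 0) / 255   # one under, one over, sum exact

print(even_undershoot)   # ~0.9960784...
print(mixed_split)       # 1.0
```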

@lexaknyazev
Member

> what's more correct, to have both weights even, but undershoot, or have one weight undershoot, and the other one overshoot

As @zeux said here:

> gltfpack renormalizes the skinning weights after quantization. This is necessary because if the weights don't add up to exactly 1 (which happens trivially when quantizing individual weights), vertices can be deformed very significantly because the weight error will get multiplied by the joint position, so an error of 1/255.f in the weight sum is actually unacceptably large
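A hedged sketch of that renormalization step (my own illustration of the approach described for gltfpack, not its actual code): quantize each weight independently, then fold the rounding residual back into one component so the raw bytes sum to exactly 255.

```python
# Hypothetical renormalization sketch: quantize to unorm8, then push the
# rounding residual onto the largest weight so the raw sum is exactly 255.
# Assumes the input weights already sum to 1.
def quantize_weights_unorm8(weights):
    q = [round(w * 255) for w in weights]
    residual = 255 - sum(q)
    # Illustration only: assign the whole residual to the largest component.
    largest = max(range(len(q)), key=lambda i: q[i])
    q[largest] += residual
    return q

q = quantize_weights_unorm8([0.5, 0.5, 0.0, 0.0])
print(q, sum(q))   # the raw bytes now sum to exactly 255
```

Real tools may spread the residual differently (e.g. across several components in error order); the point is only that the quantized sum is forced back to the component maximum.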

@vpenades
Author

So, if I understand correctly, when dealing with quantized weights, the total sum of the raw data should be 255 (for UBYTE) or 65535 (for USHORT), is that right?

If that's the case, this detail should go in the official specification.
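If that reading is right, an exporter-side check could be as simple as the following (a hypothetical helper for illustration, not part of any official tooling):

```python
# Hypothetical validation helper: for normalized integer weights, the raw
# per-vertex sum should equal the component maximum (255 for UBYTE,
# 65535 for USHORT).
def raw_weight_sum_ok(raw_weights, component_max):
    return sum(raw_weights) == component_max

print(raw_weight_sum_ok([127, 128, 0, 0], 255))   # True
print(raw_weight_sum_ok([127, 127, 0, 0], 255))   # False: 1/255 short
```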
