
Displaying uints in cmp.Diff output in decimal notation rather than hexadecimal #219

Closed
EivindSt opened this issue Jun 15, 2020 · 1 comment


@EivindSt

When comparing uint slices with cmp.Diff, the output displays differences in hexadecimal notation. While this is easily readable for e.g. bytes, it can be confusing when debugging larger numbers.
It would be nice to have options to control the format used when printing numbers.

E.g. the following test:

func TestCmpDiffUint(t *testing.T) {
	if diff := cmp.Diff([]uint{0, 1, 2, 128}, []uint{1, 1, 2, 127}); diff != "" {
		// Use t.Fatal rather than t.Fatalf: diff is not a format string,
		// and go vet flags non-constant format strings.
		t.Fatal(diff)
	}
}

produces the output:

--- FAIL: TestCmpDiffUint (0.00s)
    cmp_test.go:11:   []uint{
        - 	0x00, 0x01, 0x02, 0x80,
        + 	0x01, 0x01, 0x02, 0x7f,
          }

In comparison, when comparing []int slices with the following test:

func TestCmpDiffInt(t *testing.T) {
	if diff := cmp.Diff([]int{0, 1, 2, 128}, []int{1, 1, 2, 127}); diff != "" {
		t.Fatal(diff)
	}
}

the following output is produced:

--- FAIL: TestCmpDiffInt (0.00s)
    cmp_test.go:17:   []int{
        - 	0, 1, 2, 128,
        + 	1, 1, 2, 127,
          }

which is arguably more readable.

@dsnet
Collaborator

dsnet commented Jun 15, 2020

Thanks for the report. This is already fixed by #199.

@dsnet dsnet closed this as completed Jun 15, 2020