4 test failures on OpenBSD x86_64 (chrono-test, gtest-extra-test, xchar-test, posix-mock-test) #3670

Closed
seanm opened this issue Oct 6, 2023 · 16 comments


seanm commented Oct 6, 2023

I tried building current master (f76603f) on OpenBSD 7.3 on x86_64. It built, but there were 4 test failures. Verbose output below:

The following tests FAILED:
	  3 - chrono-test (Failed)
	  6 - gtest-extra-test (Failed)
	 17 - xchar-test (Failed)
	 19 - posix-mock-test (Failed)
Errors while running CTest
Output from these tests are in: /home/builder/external/fmt-bin/Testing/Temporary/LastTest.log
Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.
*** Error 8 in /home/builder/external/fmt-bin (Makefile:91 'test': /usr/local/bin/ctest --force-new-ctest-process --exclude-regex "CMake.Fil...)


kartikeya$ ctest -R chrono-test -V
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
Test project /home/builder/external/fmt-bin
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 3
    Start 3: chrono-test

3: Test command: /home/builder/external/fmt-bin/bin/chrono-test
3: Working Directory: /home/builder/external/fmt-bin/test
3: Test timeout computed to be: 10000000
3: [==========] Running 31 tests from 1 test suite.
3: [----------] Global test environment set-up.
3: [----------] 31 tests from chrono_test
3: [ RUN      ] chrono_test.format_tm
3: [       OK ] chrono_test.format_tm (364 ms)
3: [ RUN      ] chrono_test.format_tm_future
3: [       OK ] chrono_test.format_tm_future (0 ms)
3: [ RUN      ] chrono_test.format_tm_past
3: [       OK ] chrono_test.format_tm_past (0 ms)
3: [ RUN      ] chrono_test.grow_buffer
3: [       OK ] chrono_test.grow_buffer (0 ms)
3: [ RUN      ] chrono_test.format_to_empty_container
3: [       OK ] chrono_test.format_to_empty_container (0 ms)
3: [ RUN      ] chrono_test.empty_result
3: [       OK ] chrono_test.empty_result (0 ms)
3: [ RUN      ] chrono_test.gmtime
3: [       OK ] chrono_test.gmtime (0 ms)
3: [ RUN      ] chrono_test.system_clock_time_point
3: [       OK ] chrono_test.system_clock_time_point (0 ms)
3: [ RUN      ] chrono_test.format_default
3: [       OK ] chrono_test.format_default (0 ms)
3: [ RUN      ] chrono_test.duration_align
3: [       OK ] chrono_test.duration_align (0 ms)
3: [ RUN      ] chrono_test.tm_align
3: [       OK ] chrono_test.tm_align (0 ms)
3: [ RUN      ] chrono_test.tp_align
3: [       OK ] chrono_test.tp_align (0 ms)
3: [ RUN      ] chrono_test.format_specs
3: [       OK ] chrono_test.format_specs (0 ms)
3: [ RUN      ] chrono_test.invalid_specs
3: [       OK ] chrono_test.invalid_specs (3 ms)
3: [ RUN      ] chrono_test.locale
3: ja_JP.utf8 locale is missing.
3: [       OK ] chrono_test.locale (1 ms)
3: [ RUN      ] chrono_test.format_default_fp
3: [       OK ] chrono_test.format_default_fp (0 ms)
3: [ RUN      ] chrono_test.format_precision
3: [       OK ] chrono_test.format_precision (0 ms)
3: [ RUN      ] chrono_test.format_full_specs
3: [       OK ] chrono_test.format_full_specs (0 ms)
3: [ RUN      ] chrono_test.format_simple_q
3: [       OK ] chrono_test.format_simple_q (0 ms)
3: [ RUN      ] chrono_test.format_precision_q
3: [       OK ] chrono_test.format_precision_q (0 ms)
3: [ RUN      ] chrono_test.format_full_specs_q
3: [       OK ] chrono_test.format_full_specs_q (0 ms)
3: [ RUN      ] chrono_test.invalid_width_id
3: [       OK ] chrono_test.invalid_width_id (0 ms)
3: [ RUN      ] chrono_test.invalid_colons
3: [       OK ] chrono_test.invalid_colons (0 ms)
3: [ RUN      ] chrono_test.negative_durations
3: [       OK ] chrono_test.negative_durations (0 ms)
3: [ RUN      ] chrono_test.special_durations
3: [       OK ] chrono_test.special_durations (0 ms)
3: [ RUN      ] chrono_test.unsigned_duration
3: [       OK ] chrono_test.unsigned_duration (0 ms)
3: [ RUN      ] chrono_test.weekday
3: /home/builder/external/fmt/test/chrono-test.cc:755: Failure
3: Value of: (std::vector<std::string>{"пн", "Пн", "пнд", "Пнд"})
3: Expected: contains at least one element that is equal to "Mon"
3:   Actual: { "\xD0\xBF\xD0\xBD"
3:     As Text: "пн", "\xD0\x9F\xD0\xBD"
3:     As Text: "Пн", "\xD0\xBF\xD0\xBD\xD0\xB4"
3:     As Text: "пнд", "\xD0\x9F\xD0\xBD\xD0\xB4"
3:     As Text: "Пнд" }
3: /home/builder/external/fmt/test/chrono-test.cc:757: Failure
3: Value of: (std::vector<std::string>{"пн", "Пн", "пнд", "Пнд"})
3: Expected: contains at least one element that is equal to "Mon"
3:   Actual: { "\xD0\xBF\xD0\xBD"
3:     As Text: "пн", "\xD0\x9F\xD0\xBD"
3:     As Text: "Пн", "\xD0\xBF\xD0\xBD\xD0\xB4"
3:     As Text: "пнд", "\xD0\x9F\xD0\xBD\xD0\xB4"
3:     As Text: "Пнд" }
3: [  FAILED  ] chrono_test.weekday (6 ms)
3: [ RUN      ] chrono_test.cpp20_duration_subsecond_support
3: [       OK ] chrono_test.cpp20_duration_subsecond_support (0 ms)
3: [ RUN      ] chrono_test.timestamps_ratios
3: [       OK ] chrono_test.timestamps_ratios (0 ms)
3: [ RUN      ] chrono_test.timestamps_sub_seconds
3: [       OK ] chrono_test.timestamps_sub_seconds (0 ms)
3: [ RUN      ] chrono_test.glibc_extensions
3: [       OK ] chrono_test.glibc_extensions (0 ms)
3: [----------] 31 tests from chrono_test (379 ms total)
3:
3: [----------] Global test environment tear-down
3: [==========] 31 tests from 1 test suite ran. (381 ms total)
3: [  PASSED  ] 30 tests.
3: [  FAILED  ] 1 test, listed below:
3: [  FAILED  ] chrono_test.weekday
3:
3:  1 FAILED TEST
1/1 Test #3: chrono-test ......................***Failed    0.40 sec

0% tests passed, 1 tests failed out of 1

Total Test time (real) =   0.42 sec

The following tests FAILED:
	  3 - chrono-test (Failed)
Errors while running CTest
Output from these tests are in: /home/builder/external/fmt-bin/Testing/Temporary/LastTest.log
Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.


kartikeya$ ctest -R gtest-extra-test -V
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
Test project /home/builder/external/fmt-bin
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 6
    Start 6: gtest-extra-test

6: Test command: /home/builder/external/fmt-bin/bin/gtest-extra-test
6: Working Directory: /home/builder/external/fmt-bin/test
6: Test timeout computed to be: 10000000
6: [==========] Running 23 tests from 3 test suites.
6: [----------] Global test environment set-up.
6: [----------] 6 tests from single_evaluation_test
6: [ RUN      ] single_evaluation_test.failed_expect_throw_msg
6: [       OK ] single_evaluation_test.failed_expect_throw_msg (3 ms)
6: [ RUN      ] single_evaluation_test.failed_expect_system_error
6: [       OK ] single_evaluation_test.failed_expect_system_error (1 ms)
6: [ RUN      ] single_evaluation_test.exception_tests
6: [       OK ] single_evaluation_test.exception_tests (0 ms)
6: [ RUN      ] single_evaluation_test.system_error_tests
6: [       OK ] single_evaluation_test.system_error_tests (0 ms)
6: [ RUN      ] single_evaluation_test.failed_expect_write
6: [       OK ] single_evaluation_test.failed_expect_write (1 ms)
6: [ RUN      ] single_evaluation_test.write_tests
6: [       OK ] single_evaluation_test.write_tests (0 ms)
6: [----------] 6 tests from single_evaluation_test (7 ms total)
6:
6: [----------] 11 tests from gtest_extra_test
6: [ RUN      ] gtest_extra_test.expect_write
6: [       OK ] gtest_extra_test.expect_write (1 ms)
6: [ RUN      ] gtest_extra_test.expect_write_streaming
6: [       OK ] gtest_extra_test.expect_write_streaming (0 ms)
6: [ RUN      ] gtest_extra_test.expect_throw_no_unreachable_code_warning
6: [       OK ] gtest_extra_test.expect_throw_no_unreachable_code_warning (0 ms)
6: [ RUN      ] gtest_extra_test.expect_system_error_no_unreachable_code_warning
6: [       OK ] gtest_extra_test.expect_system_error_no_unreachable_code_warning (0 ms)
6: [ RUN      ] gtest_extra_test.expect_throw_behaves_like_single_statement
6: [       OK ] gtest_extra_test.expect_throw_behaves_like_single_statement (0 ms)
6: [ RUN      ] gtest_extra_test.expect_system_error_behaves_like_single_statement
6: [       OK ] gtest_extra_test.expect_system_error_behaves_like_single_statement (0 ms)
6: [ RUN      ] gtest_extra_test.expect_write_behaves_like_single_statement
6: [       OK ] gtest_extra_test.expect_write_behaves_like_single_statement (0 ms)
6: [ RUN      ] gtest_extra_test.expect_throw_msg
6: [       OK ] gtest_extra_test.expect_throw_msg (0 ms)
6: [ RUN      ] gtest_extra_test.expect_system_error
6: [       OK ] gtest_extra_test.expect_system_error (0 ms)
6: [ RUN      ] gtest_extra_test.expect_throw_msg_streaming
6: [       OK ] gtest_extra_test.expect_throw_msg_streaming (0 ms)
6: [ RUN      ] gtest_extra_test.expect_system_error_streaming
6: [       OK ] gtest_extra_test.expect_system_error_streaming (0 ms)
6: [----------] 11 tests from gtest_extra_test (3 ms total)
6:
6: [----------] 6 tests from output_redirect_test
6: [ RUN      ] output_redirect_test.scoped_redirect
6: [       OK ] output_redirect_test.scoped_redirect (1 ms)
6: [ RUN      ] output_redirect_test.flush_error_in_ctor
6: [       OK ] output_redirect_test.flush_error_in_ctor (0 ms)
6: [ RUN      ] output_redirect_test.dup_error_in_ctor
6: /home/builder/external/fmt/test/gtest-extra-test.cc:358: Failure
6: redir.reset(new output_redirect(f.get())) throws an exception with a different message.
6: Expected: cannot duplicate file descriptor 4: Bad file descriptor
6:   Actual: cannot flush stream: Bad file descriptor
6: [  FAILED  ] output_redirect_test.dup_error_in_ctor (0 ms)
6: [ RUN      ] output_redirect_test.restore_and_read
6: [       OK ] output_redirect_test.restore_and_read (0 ms)
6: [ RUN      ] output_redirect_test.flush_error_in_restore_and_read
6: [       OK ] output_redirect_test.flush_error_in_restore_and_read (0 ms)
6: [ RUN      ] output_redirect_test.error_in_dtor
6: [       OK ] output_redirect_test.error_in_dtor (1 ms)
6: [----------] 6 tests from output_redirect_test (5 ms total)
6:
6: [----------] Global test environment tear-down
6: [==========] 23 tests from 3 test suites ran. (18 ms total)
6: [  PASSED  ] 22 tests.
6: [  FAILED  ] 1 test, listed below:
6: [  FAILED  ] output_redirect_test.dup_error_in_ctor
6:
6:  1 FAILED TEST
1/1 Test #6: gtest-extra-test .................***Failed    0.04 sec

0% tests passed, 1 tests failed out of 1

Total Test time (real) =   0.05 sec

The following tests FAILED:
	  6 - gtest-extra-test (Failed)
Errors while running CTest
Output from these tests are in: /home/builder/external/fmt-bin/Testing/Temporary/LastTest.log
Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.


kartikeya$ ctest -R xchar-test -V
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
Test project /home/builder/external/fmt-bin
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 17
    Start 17: xchar-test

17: Test command: /home/builder/external/fmt-bin/bin/xchar-test
17: Working Directory: /home/builder/external/fmt-bin/test
17: Test timeout computed to be: 10000000
17: [==========] Running 37 tests from 9 test suites.
17: [----------] Global test environment set-up.
17: [----------] 1 test from is_string_test/0, where TypeParam = char
17: [ RUN      ] is_string_test/0.is_string
17: [       OK ] is_string_test/0.is_string (0 ms)
17: [----------] 1 test from is_string_test/0 (0 ms total)
17:
17: [----------] 1 test from is_string_test/1, where TypeParam = wchar_t
17: [ RUN      ] is_string_test/1.is_string
17: [       OK ] is_string_test/1.is_string (0 ms)
17: [----------] 1 test from is_string_test/1 (0 ms total)
17:
17: [----------] 1 test from is_string_test/2, where TypeParam = char16_t
17: [ RUN      ] is_string_test/2.is_string
17: [       OK ] is_string_test/2.is_string (0 ms)
17: [----------] 1 test from is_string_test/2 (0 ms total)
17:
17: [----------] 1 test from is_string_test/3, where TypeParam = char32_t
17: [ RUN      ] is_string_test/3.is_string
17: [       OK ] is_string_test/3.is_string (0 ms)
17: [----------] 1 test from is_string_test/3 (0 ms total)
17:
17: [----------] 21 tests from xchar_test
17: [ RUN      ] xchar_test.format_explicitly_convertible_to_wstring_view
17: [       OK ] xchar_test.format_explicitly_convertible_to_wstring_view (0 ms)
17: [ RUN      ] xchar_test.format
17: [       OK ] xchar_test.format (3 ms)
17: [ RUN      ] xchar_test.is_formattable
17: [       OK ] xchar_test.is_formattable (0 ms)
17: [ RUN      ] xchar_test.compile_time_string
17: [       OK ] xchar_test.compile_time_string (0 ms)
17: [ RUN      ] xchar_test.format_custom_char
17: [       OK ] xchar_test.format_custom_char (0 ms)
17: [ RUN      ] xchar_test.format_utf8_precision
17: [       OK ] xchar_test.format_utf8_precision (0 ms)
17: [ RUN      ] xchar_test.format_to
17: [       OK ] xchar_test.format_to (0 ms)
17: [ RUN      ] xchar_test.vformat_to
17: [       OK ] xchar_test.vformat_to (0 ms)
17: [ RUN      ] xchar_test.format_as
17: [       OK ] xchar_test.format_as (0 ms)
17: [ RUN      ] xchar_test.named_arg_udl
17: [       OK ] xchar_test.named_arg_udl (0 ms)
17: [ RUN      ] xchar_test.print
17: [       OK ] xchar_test.print (0 ms)
17: [ RUN      ] xchar_test.join
17: [       OK ] xchar_test.join (0 ms)
17: [ RUN      ] xchar_test.enum
17: [       OK ] xchar_test.enum (0 ms)
17: [ RUN      ] xchar_test.streamed
17: [       OK ] xchar_test.streamed (0 ms)
17: [ RUN      ] xchar_test.sign_not_truncated
17: [       OK ] xchar_test.sign_not_truncated (0 ms)
17: [ RUN      ] xchar_test.chrono
17: [       OK ] xchar_test.chrono (0 ms)
17: [ RUN      ] xchar_test.color
17: [       OK ] xchar_test.color (0 ms)
17: [ RUN      ] xchar_test.ostream
17: [       OK ] xchar_test.ostream (0 ms)
17: [ RUN      ] xchar_test.format_map
17: [       OK ] xchar_test.format_map (0 ms)
17: [ RUN      ] xchar_test.escape_string
17: [       OK ] xchar_test.escape_string (0 ms)
17: [ RUN      ] xchar_test.to_wstring
17: [       OK ] xchar_test.to_wstring (0 ms)
17: [----------] 21 tests from xchar_test (6 ms total)
17:
17: [----------] 1 test from format_test
17: [ RUN      ] format_test.wide_format_to_n
17: [       OK ] format_test.wide_format_to_n (0 ms)
17: [----------] 1 test from format_test (0 ms total)
17:
17: [----------] 1 test from chrono_test_wchar
17: [ RUN      ] chrono_test_wchar.time_point
17: [       OK ] chrono_test_wchar.time_point (2 ms)
17: [----------] 1 test from chrono_test_wchar (2 ms total)
17:
17: [----------] 9 tests from locale_test
17: [ RUN      ] locale_test.localized_double
17: [       OK ] locale_test.localized_double (0 ms)
17: [ RUN      ] locale_test.format
17: [       OK ] locale_test.format (0 ms)
17: [ RUN      ] locale_test.format_detault_align
17: [       OK ] locale_test.format_detault_align (0 ms)
17: [ RUN      ] locale_test.format_plus
17: [       OK ] locale_test.format_plus (0 ms)
17: [ RUN      ] locale_test.wformat
17: [       OK ] locale_test.wformat (3 ms)
17: [ RUN      ] locale_test.int_formatter
17: [       OK ] locale_test.int_formatter (0 ms)
17: [ RUN      ] locale_test.complex
17: [       OK ] locale_test.complex (0 ms)
17: [ RUN      ] locale_test.chrono_weekday
17: /home/builder/external/fmt/test/xchar-test.cc:627: Failure
17: Value of: (std::vector<std::wstring>{L"\x43F\x43D", L"\x41F\x43D", L"\x43F\x43D\x434", L"\x41F\x43D\x434"})
17: Expected: contains at least one element that is equal to L"Mon"
17:   Actual: { L"\x43F\x43D", L"\x41F\x43D", L"\x43F\x43D\x434", L"\x41F\x43D\x434" }
17: [  FAILED  ] locale_test.chrono_weekday (2 ms)
17: [ RUN      ] locale_test.sign
17: [       OK ] locale_test.sign (0 ms)
17: [----------] 9 tests from locale_test (7 ms total)
17:
17: [----------] 1 test from std_test_xchar
17: [ RUN      ] std_test_xchar.optional
17: [       OK ] std_test_xchar.optional (0 ms)
17: [----------] 1 test from std_test_xchar (0 ms total)
17:
17: [----------] Global test environment tear-down
17: [==========] 37 tests from 9 test suites ran. (18 ms total)
17: [  PASSED  ] 36 tests.
17: [  FAILED  ] 1 test, listed below:
17: [  FAILED  ] locale_test.chrono_weekday
17:
17:  1 FAILED TEST
1/1 Test #17: xchar-test .......................***Failed    0.04 sec

0% tests passed, 1 tests failed out of 1

Total Test time (real) =   0.05 sec

The following tests FAILED:
	 17 - xchar-test (Failed)
Errors while running CTest
Output from these tests are in: /home/builder/external/fmt-bin/Testing/Temporary/LastTest.log
Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.


kartikeya$ ctest -R posix-mock-test -V
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
UpdateCTestConfiguration  from :/home/builder/external/fmt-bin/DartConfiguration.tcl
Test project /home/builder/external/fmt-bin
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 19
    Start 19: posix-mock-test

19: Test command: /home/builder/external/fmt-bin/bin/posix-mock-test
19: Working Directory: /home/builder/external/fmt-bin/test
19: Test timeout computed to be: 10000000
19: [==========] Running 18 tests from 4 test suites.
19: [----------] Global test environment set-up.
19: [----------] 1 test from os_test
19: [ RUN      ] os_test.getpagesize
19: [       OK ] os_test.getpagesize (4 ms)
19: [----------] 1 test from os_test (4 ms total)
19:
19: [----------] 12 tests from file_test
19: [ RUN      ] file_test.open_retry
19: [       OK ] file_test.open_retry (1 ms)
19: [ RUN      ] file_test.close_no_retry_in_dtor
19: [       OK ] file_test.close_no_retry_in_dtor (1 ms)
19: [ RUN      ] file_test.close_no_retry
19: [       OK ] file_test.close_no_retry (0 ms)
19: [ RUN      ] file_test.size
19: [       OK ] file_test.size (0 ms)
19: [ RUN      ] file_test.max_size
19: [       OK ] file_test.max_size (0 ms)
19: [ RUN      ] file_test.read_retry
19: [       OK ] file_test.read_retry (0 ms)
19: [ RUN      ] file_test.write_retry
19: [       OK ] file_test.write_retry (0 ms)
19: [ RUN      ] file_test.dup_no_retry
19: [       OK ] file_test.dup_no_retry (0 ms)
19: [ RUN      ] file_test.dup2_retry
19: [       OK ] file_test.dup2_retry (0 ms)
19: [ RUN      ] file_test.dup2_no_except_retry
19: [       OK ] file_test.dup2_no_except_retry (0 ms)
19: [ RUN      ] file_test.pipe_no_retry
19: [       OK ] file_test.pipe_no_retry (0 ms)
19: [ RUN      ] file_test.fdopen_no_retry
19: [       OK ] file_test.fdopen_no_retry (0 ms)
19: [----------] 12 tests from file_test (5 ms total)
19:
19: [----------] 4 tests from buffered_file_test
19: [ RUN      ] buffered_file_test.open_retry
19: [       OK ] buffered_file_test.open_retry (0 ms)
19: [ RUN      ] buffered_file_test.close_no_retry_in_dtor
19: [       OK ] buffered_file_test.close_no_retry_in_dtor (0 ms)
19: [ RUN      ] buffered_file_test.close_no_retry
19: [       OK ] buffered_file_test.close_no_retry (0 ms)
19: [ RUN      ] buffered_file_test.fileno_no_retry
19: /home/builder/external/fmt/test/posix-mock-test.cc:435: Failure
19: Expected: (f.descriptor)() throws an exception of type std::system_error.
19:   Actual: it throws nothing.
19: /home/builder/external/fmt/test/posix-mock-test.cc:436: Failure
19: Expected equality of these values:
19:   2
19:   fileno_count
19:     Which is: 1
19: [  FAILED  ] buffered_file_test.fileno_no_retry (1 ms)
19: [----------] 4 tests from buffered_file_test (2 ms total)
19:
19: [----------] 1 test from scoped_mock
19: [ RUN      ] scoped_mock.scope
19: [       OK ] scoped_mock.scope (0 ms)
19: [----------] 1 test from scoped_mock (0 ms total)
19:
19: [----------] Global test environment tear-down
19: [==========] 18 tests from 4 test suites ran. (14 ms total)
19: [  PASSED  ] 17 tests.
19: [  FAILED  ] 1 test, listed below:
19: [  FAILED  ] buffered_file_test.fileno_no_retry
19:
19:  1 FAILED TEST
1/1 Test #19: posix-mock-test ..................***Failed    0.03 sec

0% tests passed, 1 tests failed out of 1

Total Test time (real) =   0.05 sec

The following tests FAILED:
	 19 - posix-mock-test (Failed)
Errors while running CTest
Output from these tests are in: /home/builder/external/fmt-bin/Testing/Temporary/LastTest.log
Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.

vitaut commented Oct 8, 2023

The chrono-test and xchar-test failures indicate that locales are broken on your platform. I'm not sure we can do much about it.


seanm commented Nov 6, 2023

Do you think it could be because of something I didn't install or configure? It's a very vanilla installation.


vitaut commented Nov 7, 2023

The test is written so that it passes even if you don't have the correct locale installed. The problem is that your standard library is lying about the presence of locales: it successfully constructs a locale, but the constructed locale behaves like the classic/English one.
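
For illustration, the idea is roughly this (a minimal sketch, not the actual test code):

#include <ctime>
#include <iomanip>
#include <iostream>
#include <locale>
#include <sstream>
#include <stdexcept>

int main() {
  try {
    // On a system without the locale, this throws std::runtime_error
    // and the locale-specific check is simply skipped.
    std::locale loc("ru_RU.UTF-8");
    std::tm tm = {};
    tm.tm_wday = 1;  // Monday; %a only reads tm_wday
    std::ostringstream os;
    os.imbue(loc);
    os << std::put_time(&tm, "%a");
    // A working locale yields a Russian abbreviation such as "пн";
    // on OpenBSD the constructor succeeds yet this still prints "Mon".
    std::cout << os.str() << '\n';
  } catch (const std::runtime_error&) {
    std::cout << "locale missing, check skipped\n";
  }
}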


vitaut commented Nov 7, 2023

What standard library do you use and what does this code print on your system?

#include <ctime>
#include <iomanip>
#include <iostream>

int main() {
  std::time_t t = std::time(nullptr);
  std::cout.imbue(std::locale("es_ES.UTF-8"));
  std::cout << std::put_time(std::localtime(&t), "%a");
}


vitaut commented Nov 8, 2023

I have a tentative fix for the posix-mock-test: 6b0082e.


seanm commented Nov 8, 2023

What standard library do you use

Looks like OpenBSD uses LLVM's: https://man.openbsd.org/intro.3

and what does this code print on your system?

kartikeya$ uname -a
OpenBSD kartikeya.rogue-research.com 7.4 GENERIC.MP#1397 amd64
kartikeya$ cat test.cxx
#include <ctime>
#include <iomanip>
#include <iostream>

int main() {
  std::time_t t = std::time(nullptr);
  std::cout.imbue(std::locale("es_ES.UTF-8"));
  std::cout << std::put_time(std::localtime(&t), "%a");
}
kartikeya$ clang++ test.cxx
kartikeya$ ./a.out
Wedkartikeya$


seanm commented Nov 8, 2023

Whereas on my Mac I get "mar" (for 'martes', I assume, today being Tuesday). So "Wed" makes no sense at all... oh wait, the clock is wrong in that VM!


seanm commented Nov 8, 2023

OK, after a magic reboot, it now outputs "Tue", which is at least the right day, but in the wrong language.

(On my test FreeBSD VM, I also get "mar", like on my Mac.)


vitaut commented Nov 8, 2023

Producing output in the wrong language looks like a bug. It should either respect the locale or report an error (https://godbolt.org/z/KoW8YPqfc) when constructing an unsupported locale.

Could be related to (https://man.openbsd.org/setlocale.3):

On OpenBSD, the only useful value for the category is LC_CTYPE. It sets the locale used for character encoding, character classification, and case conversion. For compatibility with natural language support in packages(7), all other categories — LC_COLLATE, LC_MESSAGES, LC_MONETARY, LC_NUMERIC, and LC_TIME — can be set and retrieved, too, but their values are ignored by the OpenBSD C library. A category of LC_ALL sets the entire locale generically, which is strongly discouraged for security reasons in portable programs.
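
If that is the cause, the behaviour should be reproducible below libc++ as well. A minimal sketch of what the quoted manual page describes (assuming exactly those semantics):

#include <clocale>
#include <cstdio>
#include <ctime>

int main() {
  // Per the manual page quoted above, setting LC_TIME is accepted...
  std::setlocale(LC_TIME, "es_ES.UTF-8");
  char buf[64];
  std::time_t t = std::time(nullptr);
  // ...but the OpenBSD C library ignores it, so strftime would still
  // produce the English abbreviation ("Tue" rather than "mar").
  std::strftime(buf, sizeof buf, "%a", std::localtime(&t));
  std::puts(buf);
}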


vitaut commented Nov 9, 2023

@seanm, could you report the issue to libc++ with the put_time example as a repro?


seanm commented Nov 9, 2023

I can certainly do that, yes.

Do you think I should report to OpenBSD or to LLVM libc++?


vitaut commented Nov 9, 2023

I suggest reporting to LLVM libc++ even if there might be some underlying issue with system APIs.


seanm commented Nov 9, 2023

llvm/llvm-project#71871


seanm commented Nov 16, 2023

So it's not a libc++ bug. Filed here instead: https://marc.info/?l=openbsd-bugs&m=170010336105473&w=2

ischwarze commented:

This is not a bug in {fmt} but intentional behaviour in the OpenBSD C library, for security reasons, and it is well documented in our locale(1), setlocale(3), wcsftime(3), and strftime(3) manual pages. Please close this ticket as invalid.


vitaut commented Nov 24, 2023

Disabled locale-specific chrono tests in bea7ecc, giving a warning instead; gtest-extra-test should be fixed in ffa5b14.
