Specify that int and uint are at least 32 bits on every CPU architecture #14758
Comments
No, I like that …
My mistake. Fortunately the RFC probably wouldn't affect x32. Maybe …
A platform can provide any number of registers with varying sizes. Using 32-bit integers on x86_64 / ARM64 rather than 64-bit ones will be faster. Using the largest type implemented in hardware may make sense for cases like a big integer implementation, but it's fine to leave that as a job for libraries. The …
Why would 32-bit integers be faster? x86_64 has instructions for loading qwords in one step, and for arithmetic on them.
Instruction count doesn't determine performance. The 32-bit integers are smaller (more data fits in cache), the instructions are faster, and vector instructions can deal with twice as many values at a time.
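To make the cache-density point concrete, here is a minimal sketch in today's Rust (where the fixed-width types are spelled i32 and i64) that sums the same number of elements at both widths. The element count, buffer sizes, and Instant-based timing are illustrative assumptions, not a rigorous benchmark harness; the point is only that the 32-bit buffer is half the size, so more of it stays in cache and SIMD lanes hold twice as many values.

```rust
use std::time::Instant;

fn main() {
    const N: usize = 10_000_000;
    let small: Vec<i32> = (0..N as i32).collect(); // ~40 MB of data
    let large: Vec<i64> = (0..N as i64).collect(); // ~80 MB of data

    // Sum the i32 buffer, widening to i64 to avoid overflow of the total.
    let t = Instant::now();
    let sum32: i64 = small.iter().map(|&x| x as i64).sum();
    let d32 = t.elapsed();

    // Sum the i64 buffer: same element count, twice the bytes touched.
    let t = Instant::now();
    let sum64: i64 = large.iter().sum();
    let d64 = t.elapsed();

    println!("i32 sum {} in {:?}, i64 sum {} in {:?}", sum32, d32, sum64, d64);
}
```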
Correct me if I'm wrong: Rust defines int and uint as pointer-sized integers. These may not be the most efficient integer types (see @thestinger's 64-bit example and my 16-bit example) and they shouldn't be used "by default" since the results are non-portable. #9940 proposed to fix those two problems by renaming them. This issue proposes to make int and uint at least 32 bits on every architecture. …
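The status quo this comment describes can be observed in modern Rust, where the pointer-sized types were eventually renamed to isize/usize; a short sketch:

```rust
// Sketch: the pointer-sized integer types track the target's pointer
// width, while fixed-width types do not. (isize/usize are the modern
// names for what this thread calls int/uint.)
use std::mem::size_of;

fn main() {
    println!("isize: {} bits", size_of::<isize>() * 8); // 64 on x86_64, 32 on 32-bit targets
    println!("i32:   {} bits", size_of::<i32>() * 8);   // 32 everywhere
}
```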
Closing in favour of rust-lang/rfcs#161. It doesn't belong as an issue on the tracker anymore. |
Guaranteeing that int and uint are always at least 32 bits fixes the class of bugs and security holes that worries me most in #9940.

Scenario: People use int and uint in library code and sample code, whether explicitly, by thinking of them as the "default integer types," or via integer type inference (see "consider removing the fallback to int for integer inference" #6023). On a 16-bit target, int and uint overflow in 16 bits, which is not difficult.

I encountered this problem when programming in C++ on Palm OS. (It's a 16-bit OS even on devices that used a 32-bit ARM to emulate the 68000.)
Of course there are alternate solutions.
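To illustrate the failure mode with a hypothetical 16-bit int, here is a sketch in today's Rust using i16 as a stand-in; wrapping_mul makes the two's-complement wraparound visible (a plain * would panic in a debug build rather than wrap silently). The width/height values are made up for the example.

```rust
fn main() {
    // A plausible computation if int were 16 bits: pixel count of a bitmap.
    let width: i16 = 300;
    let height: i16 = 200;
    let pixels = width.wrapping_mul(height); // 60_000 doesn't fit in i16
    println!("{}", pixels); // prints -5536, not 60000
}
```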