Reconsider assignability of number to enum type. #26362
Comments
I don't see this as enough to change our mind here; this would be a very large breaking change and would imply bifurcating flag and non-flag enums. See #17734, #21546, #11559, #15591, #8020, #18409, etc. You can write this today if your enums have known values:

```ts
type Test<T> = T extends 0 | 1 ? "number-enum" : "number";
```
|
The behavior is motivated by bitwise operations. There are times when I think if we did TypeScript over again and still had enums, we'd have made a separate construct for bit flags. |
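To make that motivation concrete, a minimal sketch (the Permissions enum is purely illustrative):

```ts
// A hypothetical flag enum.
enum Permissions {
    None  = 0,
    Read  = 1 << 0,
    Write = 1 << 1
}

// The bitwise OR yields type `number` (value 3, which matches no single member).
// The assignment below compiles only because `number` is assignable to numeric
// enum types; removing that rule would require an explicit cast here.
const readWrite: Permissions = Permissions.Read | Permissions.Write;

// Testing a flag likewise mixes enum values with plain numbers.
const canRead: boolean = (readWrite & Permissions.Read) !== 0;
```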
That is not a viable solution, because I need to generically detect that a type is more specific than simply `number`.

@DanielRosenwasser In the case of bit flag enums, the result of ORing multiple bits together doesn't even produce a value that matches any of the enum values, so it doesn't seem right at all to treat the result as the enum type (implicitly, or explicitly via type casting). Multiple bits ORed together really is just a number.

Based on the duplicate issues linked above, it doesn't seem uncommon for developers to expect/desire more strictness with enum types. My gut feeling is that the current behavior may have helped TypeScript enums be more palatable earlier on, when developers transitioning from JavaScript were more skeptical of all this type strictness. The community as a whole seems to be embracing stricter typing and its benefits more as time goes on. How do we determine if/when the community as a whole has reached the point where it would prefer stricter enum behavior while accepting that "bit flags" would be "broken"?

Consider that the breaking change, while widespread, would be very straightforward to deal with. Compiler errors would very clearly indicate the enum type that you must now cast to in order to migrate the code. It's probably so straightforward that a script could be developed to automatically migrate code to use enum type casting where necessary (if the developer wants a quick/effortless solution without taking the opportunity to review each instance for potentially incorrect code uncovered by the new strictness). The migrated code would also be backward compatible with the previous version of TypeScript.

A possibly more pragmatic approach: is it feasible to add this as a compiler option (`strictNumericEnums`)? |
BTW - I did just realize the migration for the breaking change would not be quite as simple as I suggested. If developers are declaring properties/params as bit flag enum types with the intent that they will hold values formed from combining multiple flags, then those would have to be changed to type `number`.

Also, if a variable/property is initialized to a single bit flag enum value without being explicitly typed, then its type will be inferred as the enum type, and that may break code that tries to modify that value with further bitwise operations.

This whole bit flag enum concept just feels so foreign and wrong to me: declaring something as an enum type when the values it actually holds are just numbers built out of that enum's bits. |
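A rough sketch of what those two migration cases might look like under the stricter rules (MyFlags and Options are hypothetical names):

```ts
enum MyFlags { A = 1 << 0, B = 1 << 1 }

// A property intended to hold combined flags would need to be declared
// as plain `number` (or every assignment to it would need a cast):
interface Options {
    flags: number; // bits defined by MyFlags
}

// A variable initialized from a single member is inferred as MyFlags,
// so later bitwise updates would need a cast under the stricter rules:
let current = MyFlags.A;                    // inferred type: MyFlags
current = (current | MyFlags.B) as MyFlags; // hypothetical migration cast
```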
I'm not sure how to communicate this effectively, but we really have thought about this a large amount and weighed pros and cons of complexity vs usefulness. We've taken the feedback and will continue to keep this on our radar for possible future changes. |
Consider it communicated effectively. Thanks for reading.

One final thought on bit flag enums specifically: allowing number -> enum assignment effectively neuters the type safety of declaring a property/param as an enum type. The enum type declaration becomes more of a "hint" for documentation purposes, suggesting to an observant developer that maybe they should go look at that enum definition to understand what the different bits mean, and use those enums to conveniently build a value out of those bits:

```ts
/**
 * Do something with some bit flags.
 * @param flags - Not really a MyBitFlags value, but a number whose bits are interpreted as defined by MyBitFlags.
 */
function foo(flags: MyBitFlags): void;
```

How is this any more helpful than declaring it as type `number`?

```ts
/**
 * Do something with some bit flags.
 * @param flags - See {@link MyBitFlags} for bit definitions.
 */
function foo(flags: number): void;
```
|
This is actually extremely helpful. In our own codebase, for example, we have 17 different kinds of flag masks that are never interoperable. The existing behavior makes sure we don't use one kind of flag where another kind is expected. |
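For illustration, a sketch with two made-up flag enums standing in for those incompatible mask kinds:

```ts
enum RenderFlags  { Visible = 1 << 0, Clipped = 1 << 1 }
enum NetworkFlags { Secure  = 1 << 0, Cached  = 1 << 1 }

declare function render(flags: RenderFlags): void;

render(RenderFlags.Visible | RenderFlags.Clipped); // OK: the combined `number` is assignable to RenderFlags
// render(NetworkFlags.Secure);                    // error: NetworkFlags.Secure is not assignable to RenderFlags
```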
I was surprised by the rule here, so here's a little snippet just in case anyone else was wondering:
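A minimal sketch of the rule in question (enum names are illustrative):

```ts
declare const someNumber: number;

enum Direction { Up, Down }    // numeric enum
enum Axis { X = "X", Y = "Y" } // string enum

const d: Direction = someNumber; // allowed: `number` is assignable to a numeric enum type
// const a: Axis = "X";          // error: a plain string is NOT assignable to a string enum type
const n: number = d;             // and an enum value is assignable back to `number`
```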
|
@RyanCavanaugh Ok, I can see that now. Has there been any serious thought/proposal for something like an automatically derived type for the result of a bitwise operation involving enums? For the sake of example, imagine a special generic type `BitFlags<T>`: for any enum types involved in a bitwise expression, the result would be typed as `BitFlags<>` of those enum types, while mixing in a plain `number` would widen the result back to `number`.
I'm probably missing some details, but here are some examples of how I expect this would work out:

```ts
enum FooBits {
    A = 1 << 0,
    B = 1 << 3
}

enum BarBits {
    A = 1 << 1,
    B = 1 << 4
}

// type: BitFlags<FooBits>
const flags1 = FooBits.A | FooBits.B;
// type: BitFlags<FooBits>
const flags2 = flags1 ^ FooBits.A;
// type: number
const flags3 = flags1 ^ 1;
// type: BitFlags<FooBits.A | BarBits.B>
const flags4 = FooBits.A | BarBits.B;
// type: BitFlags<FooBits.A | BarBits.B>
const flags5 = flags1 | BarBits.B;

// This function ONLY accepts individual FooBits values
function doFooBits(bit: FooBits): void;
// This function accepts bit flag values built from FooBits values
function doFooBitFlags(flags: BitFlags<FooBits>): void;

// valid
doFooBits(FooBits.A);
// ERROR: number not assignable to FooBits
doFooBits(1);
// ERROR: BitFlags<FooBits> not assignable to FooBits
doFooBits(FooBits.A | FooBits.B);
// ERROR: BarBits not assignable to FooBits
doFooBits(BarBits.A);

// valid
doFooBitFlags(FooBits.A);
// valid
doFooBitFlags(FooBits.A | FooBits.B);
// ERROR: number not assignable to BitFlags<FooBits>
doFooBitFlags(1);
// ERROR: BitFlags<FooBits.A | BarBits.B> not assignable to BitFlags<FooBits>
doFooBitFlags(FooBits.A | BarBits.B);
```

Typing of properties/params is now much more clear as to whether it's dealing with actual enum values, or bit flag values built out of enum values. This approach would also allow you to define separate enums for different subgroups of bits that can all be used together in the same flags value:

```ts
function doFooBarBitFlags(flags: BitFlags<FooBits | BarBits>): void;

// valid
doFooBarBitFlags(FooBits.A | BarBits.B);
```
|
@UselessPickles I'm writing up a separate issue to track possible next steps here as well as outline the current state of the world -- we've painted ourselves into a bit of a corner with enums and it's trickier than it first appears due to some subtle differences in enum behavior depending on how they're declared |
I expected that it's more complicated than it appears. I can only hope that something I've suggested might at least spark an idea for a workable solution. I look forward to reading about the intricacies of the "current state of the world". |
@RyanCavanaugh Have you created the separate issue you mentioned you were writing up? Please mention it here if/when you do. |
EXCITING FOLLOW-UP!!! I found a workaround for my specific example. Using a little trickery involving mapped types and intersections, I can reliably determine if a number-like type is actually a numeric enum type or just plain `number`. Here's my revised and functioning example:

```ts
enum StringEnum {
    A = "A",
    B = "B"
}

enum NumberEnum {
    A,
    B
}

type Test<T> = T extends number
    ? (true extends ({[key: number]: true} & {[P in T]: false})[number] ? "number-enum" : "number")
    : T extends string
        ? (string extends T ? "string" : "string-enum")
        : never;

let testString: Test<string>;         // type: "string"
let testStringEnum: Test<StringEnum>; // type: "string-enum"
let testNumber: Test<number>;         // type: "number"
let testNumberEnum: Test<NumberEnum>; // type: "number-enum"
```

I consider this to be only a workaround for my specific example, but not a general solution to the overall issue of `number` being assignable to numeric enum types. |
I have another cool workaround for defining a function parameter that requires more strict assignability to a numeric enum type:

```ts
/**
 * Use StrictNumericEnumParam to define the type of a function
 * parameter that should be strictly assignable to a numeric enum
 * type. This prevents arbitrary numbers from being passed in to
 * the parameter, working around TypeScript's intentional decision
 * to allow type `number` to be assignable to all numeric enum types.
 *
 * Instead of writing a function signature as:
 *     function doSomething(value: MyEnum): void;
 *
 * Write it like this:
 *     function doSomething<Value extends MyEnum>(
 *         value: StrictNumericEnumParam<MyEnum, Value>
 *     ): void;
 *
 * StrictNumericEnumParam<MyEnum, Value> will evaluate to `never`
 * for any type `Value` that is not strictly assignable to `MyEnum`
 * (e.g., type `number`, or any number literal type that is not one
 * of the valid values for `MyEnum`), and will produce a compiler
 * error such as:
 *     "Argument of type `number` is not assignable to parameter of type `never`"
 *
 * LIMITATION:
 * This only works for a special subset of numeric enums that are considered
 * "Union Enums". For an enum to be compatible, it basically must be a simple
 * numeric enum where every member has either an inferred value
 * (previous enum member + 1), or a number literal (1, 42, -3, etc.)
 *
 * If the `Enum` type argument is not a "Union Enum", then this type resolves
 * to simply type `Enum` and the use of StrictNumericEnumParam is neither
 * beneficial nor detrimental.
 */
type StrictNumericEnumParam<Enum extends number, Param extends Enum> =
    true extends (
        { [key: number]: false } & { [P in Enum]: true }
    )[Enum] ? (
        true extends (
            { [key: number]: false } & { [P in Enum]: true }
        )[Param] ? Param : never
    ) : Enum;
```

Example usage:

```ts
enum Foo {
    A, B, C
}

enum Bar {
    D, E, F
}

// typically would be written as function doFoo(value: Foo): void
declare function doFoo<Value extends Foo>(
    value: StrictNumericEnumParam<Foo, Value>
): void;

declare const foo: Foo;
declare const bar: Bar;
declare const n: number;

// valid
doFoo(Foo.A);
// valid
doFoo(foo);
// valid, because 1 is exactly one of the enum values
doFoo(1);
// Argument of type `Bar.D` is not assignable to parameter of type `Foo`
doFoo(Bar.D);
// Argument of type `Bar` is not assignable to parameter of type `Foo`
doFoo(bar);
// Argument of type `number` is not assignable to parameter of type `never`
doFoo(n);
// Argument of type `10` is not assignable to parameter of type `never`
doFoo(10);
```
|
Search Terms
number assignable enum
Suggestion
Remove assignability of number to enum types.
Type number is currently assignable to numeric enum types. I am aware that this was intentionally implemented as a special case for convenience.
With recent improvements to generics (particularly conditional types), I believe that the convenience of number -> enum assignability may now be outweighed by the power of "distinctness" of NOT allowing assignability in that direction.
Removing this special case behavior would also probably have the benefit of simplifying some code in the TypeScript compiler and reducing risk of future bugs. Here's an example of a bug that was caused by the special case behavior of number->enum assignability: #10738
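Concretely, the kind of assignment that would change (using an illustrative Color enum):

```ts
enum Color { Red, Green, Blue }

declare const n: number;

// Allowed today via the special case; under this suggestion it would be an
// error, requiring an explicit cast such as `n as Color`.
const c: Color = n;
```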
Use Cases
If number is NOT assignable to enum types, then it would enable the use of conditional types to distinguish numeric literal/enum types from plain old number, similar to the difference between string literal/enum types and plain string.
My particular use case is probably too complex to try to present a full example, but here's a high level description...
I am generating TypeScript code/interfaces from a Swagger document for making API calls. There are two parts to this: compile-time "model" interfaces describing the data, and run-time data schema definitions.
In order to link these two halves together and have strictly typed run-time schema utilities that make use of the compile-time types of the "model" interfaces, part of my solution requires a helper type that maps from the type of a property in a "model" interface to the type of that field's schema definition interface.
Currently, I can accurately distinguish between string enum vs plain string types, but I cannot do the same for number enum vs plain number. Numeric enum properties incorrectly have their types mapped to the data schema interface type for plain numeric data.
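A simplified sketch of that kind of helper type; the schema interface names here are hypothetical, not the actual generated ones:

```ts
// Hypothetical schema-definition shapes.
interface StringFieldSchema { type: "string" }
interface NumberFieldSchema { type: "number" }
interface EnumFieldSchema<T extends string | number> { type: "enum"; values: readonly T[] }

// Maps a model property's compile-time type to its schema interface.
// The string branch works today: a string enum is narrower than `string`,
// so `string extends T` tells them apart. The numeric branch does not:
// because `number` is assignable to numeric enum types, `number extends T`
// is true even when T is a numeric enum, and the mapping falls through
// to NumberFieldSchema.
type FieldSchemaFor<T> = T extends string
    ? (string extends T ? StringFieldSchema : EnumFieldSchema<T>)
    : T extends number
        ? (number extends T ? NumberFieldSchema : EnumFieldSchema<T>)
        : never;
```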
Examples
Here's an over-simplified example of the type mapping I am attempting, with comments indicating the results I would expect if number is NOT assignable to number enum:
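A sketch of the kind of mapping being described, reusing the StringEnum and NumberEnum definitions from earlier in the thread; the comments show the expected results:

```ts
enum StringEnum { A = "A", B = "B" }
enum NumberEnum { A, B }

// How the mapping would behave if `number` were NOT assignable to numeric enum types:
type Test<T> = T extends number
    ? (number extends T ? "number" : "number-enum")
    : T extends string
        ? (string extends T ? "string" : "string-enum")
        : never;

let testString: Test<string>;         // "string"
let testStringEnum: Test<StringEnum>; // "string-enum"
let testNumber: Test<number>;         // "number"
let testNumberEnum: Test<NumberEnum>; // expected "number-enum" (see below)
```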
With the current implementation of number->enum assignability, the type of `testNumberEnum` is `"number"` instead of `"number-enum"`.

Aside from that, there's all the usual benefits of the stricter typing/assignability: plain numbers will need to be explicitly cast to an enum type before assigning (forces developer to consider whether it is correct).

Then there's the breaking change and inconvenience: plain numbers will need to be explicitly cast to an enum type before assigning (forces developer to consider whether it is correct).