Not looking good. Prefer separate flags that can be composed together, e.g. `moduleResolution: { resolveFolderImport, resolveJSONModule, implyExtension, usesExportField, moduleSuffixes }`, where `moduleResolution: "whatever-we-have-today"` becomes the "preset" of those flags.
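A sketch of the decomposed shape this comment seems to have in mind (entirely hypothetical; none of these sub-flags exist today):

```ts
// Hypothetical: moduleResolution as a bag of orthogonal switches...
const moduleResolution = {
  resolveFolderImport: true,
  resolveJSONModule: true,
  implyExtension: true,
  usesExportField: true,
  moduleSuffixes: ["", ".ios"],
};
// ...with today's "node16", "classic", etc. acting as named presets of them.
```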
`--allowImportingTsExtensions` requires `--noEmit`, or you can emit `.ts` files with the TS syntax removed.
`customConditions`: if this means the conditions in the `package.json` `exports` field, I think it should be supported, but I'm not sure how many people actually use it.
See #49083 (comment) for some explanation related to extension rewriting.
In practice, my understanding of the different situations is as follows:
Regarding using `.ts` extensions in imports:

* it is now possible with the `--allowImportingTsExtensions` option;
* that makes the source code consumable by Deno (the imported paths exist on disk);
* but your source code can no longer be transpiled to JS (`--noEmit` is required).

Regarding imports without an extension: TypeScript transpiles them to `.js` files without rewriting the import paths, so you need a bundler or another third-party tool to consume the transpiled code.

In general, the TypeScript team's view is that TypeScript files should not be transpiled but consumed directly.
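A minimal sketch of the two setups described above (the file names and the `greet` helper are invented):

```ts
// greeting.ts (contents):
//   export function greet(name: string): string {
//     return `Hello, ${name}!`;
//   }

// main.ts, option A: import the real on-disk .ts path. This compiles only
// with "allowImportingTsExtensions": true, which requires "noEmit", and a
// runtime such as Deno can then execute the sources directly.
import { greet } from "./greeting.ts";
console.log(greet("world"));

// main.ts, option B: extensionless specifier. tsc type-checks it against
// greeting.ts and emits main.js with the path untouched, so a bundler or
// another resolution-aware tool has to map "./greeting" to the emitted
// ./greeting.js.
// import { greet } from "./greeting";
```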
## `--moduleResolution hybrid`

* Extensionless relative imports (like `classic` mode)?
* `node_modules` package lookup
    * as in `--moduleResolution node16`/`nodenext`
* Directory modules (resolving to an `index.*` file)
    * as in `--moduleResolution node16`/`nodenext`
* `exports` look-up from `package.json`
    * unlike `--moduleResolution node`
* `*.ts` imports
    * `--allowImportingTsExtensions`
    * `--allowImportingTsExtensions` requires `--noEmit`
    * When we emit, we will not rewrite the import paths, so a specifier like `./foo.ts` will remain the same in the output `.js` file, and this will fail in most tools if `foo` was also rewritten to a JavaScript file named `foo.js`! (A sketch follows this list.)
* `--moduleResolution bundler`?
    * `bundler` communicates more than `hybrid`.
* Or as separate flags:
    * `allowImportingTsExtensions` (already mentioned)
    * `resolvePackageJsonExports` - resolve from the `exports` field the way Node.js 12+ does today (which we do under the `node12`/`nodenext` flag)
    * `resolvePackageJsonImports` - resolve from the `imports` field the way Node.js 16+ does today (which we do under the `node12`/`nodenext` flag)
    * `customConditions`
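The code snippets that illustrated the emit concern did not survive in this copy of the notes; the following is a sketch of the failure mode being described (file contents invented):

```ts
// foo.ts (contents):
//   export const answer = 42;

// index.ts - legal under --allowImportingTsExtensions:
import { answer } from "./foo.ts";
console.log(answer);

// If TypeScript were allowed to emit this file, index.js would keep the
// specifier exactly as written:
//
//   import { answer } from "./foo.ts";
//
// which fails in most tools, because foo.ts itself was compiled away into
// a JavaScript file named foo.js and no ./foo.ts exists in the output.
```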
## Deprecation Plan

* `--ignoreDeprecations "5.0"` to suppress all the errors that 5.x has deprecated.
* In a later version, `"ignoreDeprecations": "5.0"` itself becomes an error.
* `--noSwitchCaseFallthrough`
* `--out`?
* `module`
* `--charset`?
* `--target es3`
* `"prepend": true` in project references?
## Using our internal missing type

* #51653
## `in` Operator Narrowing from Negative Checks

* #51339
* If you have `obj: object` and write code like the positive check sketched after this list, you would expect `obj.foo` to be valid and have the type `string`.
* If you write the negative case and bail out early (the second half of that sketch), you'd expect the same.
* This is the same problem as the fact that `obj` itself is narrowed in a distinct manner from `obj.foo`.
    * `obj` is only narrowed to have type `{ "foo": unknown } & object` - not compatible with anything that requires `{ foo: string }`.
    * `obj.foo` is narrowed to have the type `string`.
* `typeof x.y` as a discriminant #32399.
* It would really be ideal if `obj` was narrowed to `{ foo: string } & object`, and narrowing `obj.foo` would just "fall out" from narrowing `obj` itself.
* Can imagine that we just stack intersections as we learn more
    * e.g. `object & { foo: unknown } & { foo: string }`, which simplifies to `object & { foo: string }`
* The problem with doing that is you would end up with huge types in some cases - which adds visual noise and impacts performance.
* Additionally, you can "learn" information by introducing intersections when narrowing types - but when you join from two branches, how do you know which intersections were "learned" from type guards vs. which intersections were already there?
* Feels doable; making this efficient also makes it harder.
    * We eagerly normalize intersections, which is part of what makes this challenging.
* There's a silver lining - simplifying narrowing to just narrow the roots of references would make us more efficient in other ways.
* Is this compelling?
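The inline code samples in these notes were not preserved in this copy; the following sketch is consistent with the surrounding bullets (only the property name `foo` and the narrowed types come from the notes):

```ts
declare const obj: object;

// Positive check: obj.foo is accessible and narrows to string...
if ("foo" in obj && typeof obj.foo === "string") {
  obj.foo.toUpperCase(); // obj.foo: string
  // ...but obj itself is only narrowed to { foo: unknown } & object, so it
  // still isn't assignable to anything that requires { foo: string }:
  const target: { foo: string } = obj; // error
}

// Negative check that bails out early: you'd expect obj.foo to end up
// narrowed to string after the early return, just like above.
function f(value: object) {
  if (!("foo" in value) || typeof value.foo !== "string") {
    return;
  }
  value.foo.toUpperCase(); // expected: value.foo is string here
}
```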
## `completionEntryDetails` (#51526)