------------- Motivation and context (not essential to understand the question) ----------------
In my TypeScript app, I have objects with references to other objects. Sometimes these references are populated, sometimes not. Example:
```ts
type Entity = {
  ref1: string | Obj,
  ref2: string | Obj
}
```
I wanted to introduce a generic parameter `T` that lets me specify which of the props are populated. If I don't pass `T`, all references need to be strings. If I pass `T = {ref1: Obj}`, then `ref1` needs to be populated, but `ref2` must not be. And if I pass `T = {ref1: Obj, ref2: Obj}`, both references need to be populated. The most flexible way to write this turned out to be something like:
```ts
type Entity<T extends Partial<{ref1: string | Obj, ref2: string | Obj}> = {}> = {
  ref1: T["ref1"] extends string | Obj ? T["ref1"] : string,
  ref2: T["ref2"] extends string | Obj ? T["ref2"] : string,
}
```
---------- The question / bug ------------------------
All was working well until I noticed a place in my code where TypeScript should have thrown an error, but didn't. I simplified the above type to investigate further, since I know union types combined with conditional types can cause unexpected behaviour. I came up with this weird result in the TypeScript Playground:
--- Minimal reproducible example (Playground link) ---
```ts
type Entity<T extends {ref1?: unknown}> = {
  ref1: T["ref1"] extends string ? T["ref1"] : number;
}

type A = Entity<{ref1: string}> extends Entity<{}> ? true : false                // correctly true
type B = Entity<{ref1: number}> extends Entity<{}> ? true : false                // incorrectly true
type C = Entity<{ref1: number}> extends Entity<{ref1: undefined}> ? true : false // correctly false
type D = Entity<{ref1: string}> extends Entity<{ref1: undefined}> ? true : false // incorrectly false
```
But that makes no sense: in type `B`, `Entity<{ref1: number}>` simplifies to `{ref1: number}`, whereas `Entity<{}>` simplifies to `{ref1: string}`, and those are incompatible.
When I am more explicit, as in type `C`, TypeScript understands correctly.
Is there something I don't understand about TypeScript that explains that behaviour, or is it a TypeScript bug?
---------------- Answer ----------------
A fundamental feature of TypeScript's type system is that it's structural and not nominal. This is in contrast to type systems like Java's, where `interface A {}` and `interface B {}` are considered different types because they have different declarations. In TypeScript, when types `A` and `B` have the same structure, then they are the same type, regardless of their declarations. TypeScript's type system is structural.

Except when it's not.
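For instance (a minimal sketch; the interface names are just for illustration):

```ts
interface A { x: string }
interface B { x: string }

const a: A = { x: "hello" };
const b: B = a; // fine in TypeScript: A and B are structurally identical
```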
Comparing two types structurally can be expensive for the type checker. Imagine two deeply nested or even recursive types `X` and `Y`, and the compiler has to check whether `X` is a subtype of `Y` because you are trying to assign a value of type `X` to a variable of type `Y`. The compiler needs to start checking each property of `Y` against that of `X`, and each subproperty, and each subsubproperty, until it finds an incompatibility or until it reaches a point where it can stop checking (either it reaches the end of the tree structure or it finds something it's already checked). If this were the only type comparison method available to the TypeScript compiler, it would be very slow.

So instead the compiler sometimes takes a shortcut. When you define a generic object type, the compiler measures the type's variance (see Difference between Variance, Covariance, Contravariance and Bivariance in TypeScript) from its definition, and marks it as such for later use. For example, in:
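(a sketch of such definitions; the names `Cov`, `Contra`, `Inv`, and `Biv` are mine)

```ts
type Cov<T> = () => T;             // T used only in output position
type Contra<T> = (x: T) => void;   // T used only in input position
type Inv<T> = (x: T) => T;         // T used in both positions
type Biv<T> = { foo(x: T): void }; // method syntax: parameters checked bivariantly
```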
the compiler marks those as covariant, contravariant, invariant, and bivariant, respectively. That makes sense with the type system:
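Continuing the sketch, the measured variance shows up in assignability:

```ts
declare let covS: Cov<string>;
declare let covU: Cov<unknown>;
covU = covS; // okay: Cov is covariant, and string extends unknown
covS = covU; // error: unknown does not extend string

declare let conS: Contra<string>;
declare let conU: Contra<unknown>;
conS = conU; // okay: Contra is contravariant
conU = conS; // error (with strictFunctionTypes enabled)
```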
When the compiler assigns those markers to a complicated generic type `F<T>`, a lot of time can be saved when comparing `F<X>` to `F<Y>`. Instead of having to plug `X` and `Y` into `F` and then compare the results, it can just compare `X` and `Y` and use the variance marker on `F` to compute the desired result. In such cases, the compiler can just treat `F<T>` as a black box.

If you have another complicated generic type `G<T>`, though, the compiler can't use a shortcut to compare `F<X>` to `G<Y>`, since `G` could be completely unrelated to `F`. Even if it turns out that `F`'s and `G`'s definitions are the same, the compiler wouldn't know that unless it compares those definitions, meaning the black box needs to be opened. A full structural comparison is the only option here.

So here we have a situation where `F<X> extends G<Y>` results in one code path for the compiler, but `F<X> extends F<Y>` results in another. That sure looks like the compiler comparing those types nominally, based on their declarations.

But such a difference is unobservable, right? Because the type system is completely sound, right? If the compiler assigns a variance marker, it does so correctly, right? And subtyping is a transitive relationship, so if `X extends Y` and `Y extends Z` then `X extends Z`, and if `F<T>` is marked covariant in `T` then `F<X> extends F<Y>` and `F<Y> extends F<Z>` and also `F<X> extends F<Z>`, right? Right?
No, of course not, not completely. TypeScript's type system is intentionally unsound in places where convenience has been considered more important. One such place is with optional properties and optional parameters. The compiler lets you assign a value of type `{x: string}` to a variable of type `{x: string, y?: number}`, for convenience:
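For instance (a minimal sketch):

```ts
const src: { x: string } = { x: "abc" };
const dst: { x: string; y?: number } = src; // okay, for convenience
```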
But this is unsafe, because really a missing property definition has a completely `unknown` value at runtime:
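The names `YB`, `X`, and `YN` below follow the text; their exact definitions are my reconstruction:

```ts
type X  = { x: string };
type YN = { x: string; y?: number };
type YB = { x: string; y: boolean };

const yb: YB = { x: "abc", y: true };
const x: X = yb;  // okay: YB extends X
const yn: YN = x; // okay: X extends YN, for convenience
yn.y?.toFixed(2); // no compiler error, but explodes at runtime:
                  // yn.y is actually the boolean true
```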
So you have a situation where `YB extends X` and `X extends YN` are true, but `YB extends YN` is false. See microsoft/TypeScript#42479 and the issues linked within for more information.
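At the type level, the broken transitivity looks like this:

```ts
type T1 = YB extends X ? true : false;  // true
type T2 = X extends YN ? true : false;  // true
type T3 = YB extends YN ? true : false; // false: subtyping isn't transitive here
```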
If you have a type function `F<T>` that's marked covariant in `T`, then what should happen? Presumably `F<YB> extends F<X>` and `F<X> extends F<YN>` are true, but this can easily be violated if `F` is indexing into that optional property and doing something with its type. That's more or less what's happening with your code:
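Recapping with the `Entity` type from the question:

```ts
type Entity<T extends { ref1?: unknown }> = {
  ref1: T["ref1"] extends string ? T["ref1"] : number;
};

// the compiler compares these by variance marker: since
// {ref1: number} extends {}, it concludes the whole thing is
// true without expanding either side
type B = Entity<{ ref1: number }> extends Entity<{}> ? true : false; // true
```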
The compiler thinks `Entity` is covariant (or maybe bivariant) and so the result is `true` because `{ref1: string} extends {}`. But indexing into the `ref1` property of `{}` is `unknown`, and so `Entity` really shouldn't be covariant (maybe?):
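Expanding by hand (my own reading of what a structural comparison would do, so take it with a grain of salt):

```ts
type E1 = Entity<{ ref1: string }>; // { ref1: string }
type E2 = Entity<{}>;               // indexing {} at "ref1" gives unknown,
                                    // which is not a string, so { ref1: number }
// A structural comparison of { ref1: string } against { ref1: number }
// would fail, so the variance shortcut and the structural answer disagree.
```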
Maybe the variance marker is being assigned incorrectly? It's hard to say. Given the unsoundness around optional properties, there is probably no "correct" marker, and the compiler just picks something that behaves well in a wide range of real-world situations. That's just how it is sometimes; see microsoft/TypeScript#43608 for example.

So what can be done? Well, you could refactor completely, but in cases where you don't like the variance marker assigned by the compiler, you can assign your own (as long as the compiler doesn't see a conflict) using the optional type parameter variance annotations `in` (for contravariance), `out` (for covariance), or `in out` (for invariance). So maybe:
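A sketch of what that could look like:

```ts
type Entity<in out T extends { ref1?: unknown }> = {
  ref1: T["ref1"] extends string ? T["ref1"] : number;
};

type B = Entity<{ ref1: number }> extends Entity<{}> ? true : false; // now false
```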
which makes `Entity` behave as if it were invariant in its type parameter, and therefore `Entity<T> extends Entity<U>` will be false unless `T` and `U` are the same type. That's not necessarily what you want to do, and it's not true in general, but at least it's a lever you can pull to change this behavior.

Playground link to code