Casting generic class inside generic function seems to change inference of type parameter


I have a weird situation where changing T.self inside a generic function to T.self as T.Type changes the semantics of the code:

class Foo {
  required init() {}
}
class Bar : Foo {
}
func f<T: Foo>(_:T) -> T {
  return T.self()
}
println(f(Bar())) // prints MyProject.Foo

but

class Foo {
  required init() {}
}
class Bar : Foo {
}
func f<T: Foo>(_:T) -> T {
  return (T.self as T.Type)()
}
println(f(Bar())) // prints MyProject.Bar

This doesn't make sense. The code uses T.self to create an instance of the class of T. T could be inferred as either Foo or Bar at the call to f, but I would expect it to be inferred to the same thing in both versions: inference of the type argument should depend only on the signature and the calling code, and those are identical in both cases.

T.self should already be of type T.Type, so casting it should be a no-op (in fact, the compiler shouldn't even allow the cast, since it should always be true). Yet by performing this cast, I seem to be changing the class that I am calling the initializer on. Casting an object should not alter the value of the object if it succeeds, so this is really weird.
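For context, the distinction that seems to be at play here is between the static metatype bound to T at the call site and the dynamic metatype of the instance. A sketch in current Swift (where println has become print, and metatypes(_:) is a hypothetical helper name, not anything from the question):

```swift
class Foo {
  required init() {}
}
class Bar: Foo {}

// T.self is the static metatype chosen at the call site from the
// argument's declared type; type(of:) is the instance's dynamic metatype.
func metatypes<T: Foo>(_ value: T) -> (inferred: Foo.Type, dynamic: Foo.Type) {
  return (T.self, type(of: value))
}

let upcast: Foo = Bar()
let result = metatypes(upcast)
print(result.inferred, result.dynamic) // prints "Foo Bar": T was inferred from the static type
```

The two metatypes only coincide when the argument's static and dynamic types agree, which is exactly the situation the cast above appears to be perturbing.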

1 Answer

There's a thread on this in the dev forums right now. The most relevant parts attribute this to Swift's parametric type system:

Swift's type system is parametric, meaning that a generic function or type behaves identically over its generic type parameters, rather than substitution-based like C++. [...] In a parametric system like Swift, a generic function only has to be type-checked and compiled once, because the behavior does not change per type, and generics can be used across compilation unit boundaries.
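In other words, the generic function is compiled once, and the caller passes T's metatype in at runtime rather than stamping out a per-type copy as C++ templates would. A minimal sketch of how this behaves in current Swift (print replacing println; the quirk from the question has since been resolved, so T.init() is the unambiguous spelling):

```swift
class Foo {
  required init() {}
}
class Bar: Foo {}

// Compiled once for all T; the caller supplies T's metatype at runtime,
// and T.init() constructs an instance of whatever metatype was passed.
func f<T: Foo>(_ value: T) -> T {
  return T.init()
}

print(type(of: f(Bar()))) // prints "Bar": T is inferred as Bar from the argument's static type
```

The required initializer on Foo is what guarantees every subclass can be constructed through the metatype, which is why the compiler insists on it for this pattern.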

There's a lot more exploration and explanation in that thread.