Here's a function polymorphic in 3 types:

:t (.)
(.) :: (b -> c) -> (a -> b) -> a -> c

and here's a non-polymorphic function:

:t Data.Char.digitToInt
Data.Char.digitToInt :: Char -> Int

If we apply the former to the latter, we get a function polymorphic in 1 type:

:t (.) Data.Char.digitToInt
(.) Data.Char.digitToInt :: (a -> Char) -> a -> Int

which means that (.) was "instantiated" (I'm not sure this is the correct term; as a C++ programmer, that's what I'd call it) with b === Char and c === Int, so the signature of the (.) that gets applied to digitToInt is the following:

(Char -> Int) -> (a -> Char) -> a -> Int

My question is: is there a way to have this signature printed on screen, given (.), digitToInt and the "information" that I want to apply the former to the latter?

For those who are interested, this question was earlier closed as a duplicate of this one.

There are 5 answers below.

BEST ANSWER

Other answers require the help of functions that have been defined with artificially restricted types, such as the asTypeOf function in the answer from HTNW. This is not necessary, as the following interaction shows:

Prelude> let asAppliedTo f x = const f (f x)

Prelude> :t head `asAppliedTo` "x"
head `asAppliedTo` "x" :: [Char] -> Char

Prelude> :t (.) `asAppliedTo` Data.Char.digitToInt
(.) `asAppliedTo` Data.Char.digitToInt
  :: (Char -> Int) -> (a -> Char) -> a -> Int

This exploits the lack of polymorphism in the lambda-binding that is implicit in the definition of asAppliedTo. Both occurrences of f in its body must be given the same type, and that common type is also the type of asAppliedTo's result. The function const used here also has its natural type a -> b -> a:

const x y = x
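
As a quick check (the exact type-variable names GHCi prints vary between versions), you can also ask for the type inferred for asAppliedTo itself: its result type is the very type of its first argument, refined by whatever the second argument forces a to be.

Prelude> :t asAppliedTo
asAppliedTo :: (a -> b) -> a -> a -> b
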
ANSWER

I think @HTNW's answer probably covers it, but for completeness, here's how the inContext solution works in detail.

The type signature of the function:

inContext :: a -> (a -> b) -> a

means that, if you have a thing you want to type, and a "context" in which it's used (expressible as a lambda that takes it as an argument), say with types:

thing :: a1
context :: a2 -> b

You can force unification of a1 (the general type of thing) with a2 (the constraints of the context) simply by constructing the expression:

thing `inContext` context

Normally, the unified type thing :: a would be lost, but the type signature of inContext implies that the type of this whole resulting expression will also be unified with the desired type a, and GHCi will happily tell you the type of that expression.

So the expression:

(.) `inContext` \hole -> hole digitToInt

ends up getting assigned the type that (.) would have within the specified context. You can write this, somewhat misleadingly, as:

(.) `inContext` \(.) -> (.) digitToInt

since (.) is as good an argument name for an anonymous lambda as hole is. This is potentially confusing, since we're creating a local binding that shadows the top-level definition of (.), but it's still naming the same thing (with a refined type), and this abuse of lambdas allowed us to write the original expression (.) digitToInt verbatim, with the appropriate boilerplate.

It's actually irrelevant how inContext is defined, if you're just asking GHCi for its type, so inContext = undefined would have worked. But, just looking at the type signature, it's easy enough to give inContext a working definition:

inContext :: a -> (a -> b) -> a
inContext a _ = a

It turns out that this is just the definition of const, so inContext = const works, too.
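
For instance, assuming digitToInt has been imported from Data.Char, you can define inContext on the fly in GHCi and reproduce the type asked for in the question:

λ> import Data.Char (digitToInt)
λ> let inContext = const :: a -> (a -> b) -> a
λ> :t (.) `inContext` \hole -> hole digitToInt
(.) `inContext` \hole -> hole digitToInt
  :: (Char -> Int) -> (a -> Char) -> a -> Int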

You can use inContext to type multiple things at once, and they can be expressions instead of names. To accommodate the former, you can use tuples; for the latter to work, you have to use more sensible argument names in your lambdas.

So, for example:

λ> :t (fromJust, fmap length) `inContext` \(a,b) -> a . b
(fromJust, fmap length) `inContext` \(a,b) -> a . b
  :: Foldable t => (Maybe Int -> Int, Maybe (t a) -> Maybe Int)

tells you that in the expression fromJust . fmap length, the types have been specialized to:

fromJust :: Maybe Int -> Int
fmap length :: Foldable t => Maybe (t a) -> Maybe Int
ANSWER

You can do that using the TypeApplications extension, which allows you to explicitly specify which types you want to use to instantiate type parameters:

λ :set -XTypeApplications                                 
λ :t (.) @Char @Int
(.) @Char @Int :: (Char -> Int) -> (a -> Char) -> a -> Int

Note that the type arguments must be supplied in order.

For functions that have a "regular" type signature like foo :: a -> b, the order is defined by the order in which the type parameters first appear in the signature.

For functions that use ExplicitForAll, like foo :: forall b a. a -> b, the order is the one given in the forall.
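
For example (plain and flipped are made-up names here, loaded from a file or typed into GHCi): with plain, @Char instantiates a, because a appears first in its signature, while with flipped it instantiates b, because the forall lists b first.

-- made-up definitions for illustration (the second needs ExplicitForAll)
plain :: a -> b -> a                 -- type variables appear in the order a, b
plain x _ = x

flipped :: forall b a. a -> b -> a   -- the forall lists b before a
flipped x _ = x

λ :t plain @Char
plain @Char :: Char -> b -> Char
λ :t flipped @Char
flipped @Char :: a -> Char -> a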


If you want to figure out the type specifically based on applying (.) to digitToInt (as opposed to just knowing which types to fill in), I'm pretty sure you can't in GHCi, but I can highly recommend Haskell IDE support.

For example, here's how it looks for me in VSCode with the Haskell extension:

[screenshot of the specialized type shown in VSCode]

ANSWER

This is a minor variation on HTNW's answer.

Suppose we have any, potentially large, expression involving a polymorphic identifier poly

 .... poly ....

and we wonder how the polymorphic type was instantiated at that point.

This can be done by exploiting two features of GHC: asTypeOf (as mentioned by HTNW) and typed holes, as follows:

 .... (poly `asTypeOf` _) ....

Upon reading the _ hole, GHC will generate an error reporting the type of the term that should be entered in place of that hole. Since we used asTypeOf, this must be the same as the type of the particular instance of poly we need in that context.

Here's an example in GHCi:

> ((.) `asTypeOf` _) Data.Char.digitToInt
<interactive>:11:17: error:
    * Found hole: _ :: (Char -> Int) -> (a -> Char) -> a -> Int
ANSWER

There's this neat little function hidden in a corner of the Prelude:

Prelude.asTypeOf :: a -> a -> a
asTypeOf x _ = x

It's documented as "forcing its first argument to have the same type as the second." We can use this to force the type of (.)'s first argument:

-- (.) = \x -> (.) x = \x -> (.) $ x `asTypeOf` Data.Char.digitToInt
-- eta expansion followed by definition of asTypeOf
-- the RHS is just (.), but restricted to arguments with the same type as digitToInt
-- "what is the type of (.) when the first argument is (of the same type as) digitToInt?"
ghci> :t \x -> (.) $ x `asTypeOf` Data.Char.digitToInt
\x -> (.) $ x `asTypeOf` Data.Char.digitToInt
  :: (Char -> Int) -> (a -> Char) -> a -> Int

Of course, this works for as many arguments as you need.

ghci> :t \x y -> (x `asTypeOf` Data.Char.digitToInt) . (y `asTypeOf` head)
\x y -> (x `asTypeOf` Data.Char.digitToInt) . (y `asTypeOf` head)
  :: (Char -> Int) -> ([Char] -> Char) -> [Char] -> Int

You can consider this a variation of @K.A.Buhr's idea in the comments—using a function with a signature more restrictive than its implementation to guide type inference—except we don't have to define anything ourselves, at the cost of not being able to just copy the expression in question under a lambda.
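
And if you do want a hand-rolled helper in that spirit, a one-liner suffices (sameTypeAs is a made-up name): its implementation is just const, and only the annotation restricts both arguments to the same type, exactly like asTypeOf does.

ghci> let sameTypeAs = const :: a -> a -> a
ghci> :t \x -> (.) $ x `sameTypeAs` Data.Char.digitToInt
\x -> (.) $ x `sameTypeAs` Data.Char.digitToInt
  :: (Char -> Int) -> (a -> Char) -> a -> Int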