I'm used to TypeScript, in which one can use a `!` to tell the type checker to assume a value won't be null. Is there something analogous when using type annotations in Python?
A (contrived) example: when the expression `self.maybe_num + 3` in the code below is executed, the enclosing `if` guarantees that `maybe_num` won't be `None`. But the type checker doesn't know that, and reports an error. (Verified at https://mypy-play.net/?mypy=latest&python=3.10.) How can I tell the type checker that I know better?
```python
from typing import Optional

class MyClass:
    def __init__(self, maybe_num: Optional[int]):
        self.maybe_num = maybe_num

    def has_a_num(self) -> bool:
        return self.maybe_num is not None

    def three_more(self) -> Optional[int]:
        if self.has_a_num():
            # mypy error: Unsupported operand types for + ("None" and "int")
            return self.maybe_num + 3
        else:
            return None
```
Sadly there's no clean way to have the type checker infer non-nullness from a method call like this, but you can work some magic with a `TypeGuard` annotation on the `has_a_num()` method. The benefit won't really be felt unless the narrowing involves more than the type of a single `int`, though; if it's just one value, you should stick with a standard `is not None` check.

The trick is to define a subclass of your primary class in which every attribute whose type is affected is explicitly re-declared with its narrowed type. Your checker method still returns a boolean at runtime, but annotating its return type as `TypeGuard[...]` of that subclass tells mypy to narrow the checked argument to the listed type whenever the method returns `True`.
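As background on how `TypeGuard` narrowing behaves for an ordinary function parameter, here is a minimal sketch of my own (the function names are illustrative, not from the question):

```python
from typing import Optional

try:
    from typing import TypeGuard  # Python 3.10+
except ImportError:
    from typing_extensions import TypeGuard  # earlier versions, via pip

def is_int(val: Optional[int]) -> TypeGuard[int]:
    # At runtime this is an ordinary boolean check; the TypeGuard return
    # annotation tells the checker to narrow `val` to int when it is True.
    return val is not None

def plus_three(val: Optional[int]) -> Optional[int]:
    if is_int(val):
        return val + 3  # no error: `val` has been narrowed to int here
    return None
```

mypy accepts `val + 3` with no complaint because the call to `is_int` narrows `val` inside the `if` branch.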
Sadly, mypy will only apply this narrowing to proper function parameters, not to the implicit `self` argument, but this can be fixed easily enough by providing `self` explicitly: call `MyClass.has_a_num(self)` instead of `self.has_a_num()`. That syntax is yucky, but it works with mypy.
That makes the full solution:
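A complete sketch of that approach (the subclass name `_MyClassWithNum` is my own choice; the return annotation is quoted because the subclass is defined below the method):

```python
from typing import Optional

try:
    from typing import TypeGuard  # Python 3.10+
except ImportError:
    from typing_extensions import TypeGuard  # earlier versions, via pip

class MyClass:
    def __init__(self, maybe_num: Optional[int]):
        self.maybe_num = maybe_num

    # Still a plain boolean check at runtime; the annotation tells mypy
    # that a True result narrows the first positional argument.
    def has_a_num(self) -> "TypeGuard[_MyClassWithNum]":
        return self.maybe_num is not None

    def three_more(self) -> Optional[int]:
        # Call through the class so `self` is an explicit argument
        # that the TypeGuard can narrow.
        if MyClass.has_a_num(self):
            return self.maybe_num + 3  # self.maybe_num is an int here
        else:
            return None

# Subclass that re-declares the attribute with its narrowed type.
class _MyClassWithNum(MyClass):
    maybe_num: int
```

The behavior at runtime is unchanged; only the annotations (and the class-qualified call) differ from the question's code.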
`TypeGuard` was added in Python 3.10, but can be used in earlier versions via the `typing_extensions` module from pip.
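For instance, a version-tolerant import (assuming `typing_extensions` has been installed on pre-3.10 interpreters) might look like:

```python
import sys

if sys.version_info >= (3, 10):
    from typing import TypeGuard  # in the stdlib from 3.10 onward
else:
    from typing_extensions import TypeGuard  # pip install typing_extensions
```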