Sentinel objects and mypy type checking
As you may have seen in my post about default parameters and sentinel values, I think there are a few situations that lend themselves well to using sentinel objects.
There are, however, a few really annoying issues with creating a sentinel by instantiating `object()`. Firstly, if you want to annotate your code with type hints, which by the way is almost always worth doing on production code bases, you can run into some annoyances with mypy when your code uses sentinels.
For example, suppose you have code you want to add annotations to, like:
```python
DEFAULT_VAL = object()

def fun(param=DEFAULT_VAL):
    """Param is a string that we wish to process."""
    if param is DEFAULT_VAL:
        ...  # do default stuff
    else:
        ...  # do other stuff
```
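To make the pattern concrete, here's a minimal runnable sketch (the `greet` function is my own illustration, not from the post above): unlike a `None` default, a sentinel lets you distinguish "argument not passed" from an explicit `None`.

```python
DEFAULT = object()  # module-private sentinel

def greet(name=DEFAULT):
    if name is DEFAULT:
        # The caller passed nothing at all.
        return "hello, world"
    # The caller passed something, possibly an explicit None.
    return f"hello, {name}"

print(greet())      # -> hello, world
print(greet(None))  # -> hello, None
```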
If you have encountered things like this before, you'll often find that you want to annotate the function so that it accepts both the
`str` type and the sentinel type:

```python
def fun(param: Union[str, object] = DEFAULT_VAL):
```
Making a type annotation for `param` is annoying, since `DEFAULT_VAL` is of type `object`, which means that if you accept `object` as a parameter type you are in effect accepting anything. This is obviously not what we want in many cases. There are ways of working around it; for example, you can create a base class for sentinels, as described in this blog post. That approach at least lets you avoid a union with the overly broad `object`, so you get better type checking, but it still has some edge case runtime issues.
```python
class _Sentinel(object):
    ...

DEFAULT_VAL = _Sentinel()

def fun(param: Union[str, _Sentinel] = DEFAULT_VAL):
    ...
```
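The practical difference shows up when you run mypy over the two signatures. A sketch of what I'd expect (the `fun_broad`/`fun_narrow` names are illustrative, not from the post):

```python
from typing import Union

class _Sentinel:
    ...

DEFAULT_VAL = _Sentinel()

def fun_broad(param: Union[str, object] = object()) -> None:
    ...

def fun_narrow(param: Union[str, _Sentinel] = DEFAULT_VAL) -> None:
    ...

# mypy is silent here: Union[str, object] collapses to plain object,
# so any argument type checks.
fun_broad([1, 2, 3])

# mypy flags this call: a list is neither str nor _Sentinel.
fun_narrow([1, 2, 3])  # type: ignore[arg-type]
```

At runtime both calls succeed either way; the narrow union only buys you anything at type-checking time.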
While this approach doesn't sacrifice type checking by accepting an overly broad type, it still has a few edge case issues. A particularly annoying one is that a module may import `DEFAULT_VAL`, and if in the interim the code is reloaded or the value is unpickled from somewhere else, the identity check `if param is DEFAULT_VAL` can fail. This is a nasty edge case that I don't typically find people talking about much. I think part of the reason is that when you use `None` as the default parameter you don't run into it, because `None` is a special singleton object that is created only once, as a special case in the initialization phase of the Python interpreter¹.
Creating sentinel objects that can be properly annotated and that also handle serialization and other edge cases, preferably without sacrificing much performance, is tougher than it first appears. This is why PEP 661 exists: being able to deal with sentinels properly at the language level would be very useful and would bring consistency between projects. Hopefully this support lands in the main language soon.
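In the meantime, one interim pattern worth knowing (a sketch, not an official recommendation) is a single-member `Enum`: mypy can express "the sentinel and nothing else" via `Literal`, and enum members are pickled by name, so identity survives a round trip:

```python
import pickle
from enum import Enum
from typing import Literal, Union

class _Default(Enum):
    TOKEN = 0

DEFAULT = _Default.TOKEN

def fun(param: Union[str, Literal[_Default.TOKEN]] = DEFAULT) -> str:
    if param is DEFAULT:
        return "default behaviour"
    return param.upper()

# Enum members are serialized by name, so identity is preserved.
print(pickle.loads(pickle.dumps(DEFAULT)) is DEFAULT)  # -> True
print(fun())       # -> default behaviour
print(fun("abc"))  # -> ABC
```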
I think that using explicit type hints in Python is almost always worth doing, because it lets you leverage powerful error checking tooling. Blog posts like this one about mypy and error handling show the gains you can get from the mypy type checker and cover some of the difficulties you can encounter along the way. Most of the "difficulties" with better-typed code amount to it forcing you to fix otherwise silent bugs; the extra upfront effort forces you to write correct code from the beginning, and as is always the case in software engineering, the sooner you find a bug the cheaper it is to fix.