> I call it my billion-dollar mistake… At that time, I was designing the first comprehensive type system for references in an object-oriented language. My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years. – Tony Hoare, inventor of ALGOL W
But why is null so evil?
- NULL represents a distinct state, and any type can take on that state. Rather than defining a new type for each state an entity needs to express, we muddy our abstractions with this oddly generalized value.
- NULL is hard to reason about when debugging. To track down a NULL-related bug, you have to determine which state the program was in and what led to that state.
- Once NULL references are accepted as valid states for parameters and return values, we have to duplicate NULL checks in many places.
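The last point is worth seeing concretely. The sketch below (the names are illustrative, not from any particular codebase) shows how a single nullable return value forces every caller to repeat the same defensive check:

```java
import java.util.HashMap;
import java.util.Map;

class UserRepository {
    private final Map<String, String> emails = new HashMap<>();

    UserRepository() {
        emails.put("alice", "alice@example.com");
    }

    // Returns null when the user is unknown: a distinct state
    // smuggled through an ordinary reference.
    String findEmail(String username) {
        return emails.get(username);
    }
}

public class NullChecks {
    public static void main(String[] args) {
        UserRepository repo = new UserRepository();
        String email = repo.findEmail("bob");

        // Check #1: before formatting for display.
        String display = (email != null) ? email : "<no email>";
        System.out.println(display);

        // Check #2: the very same test again before "sending".
        if (email != null) {
            System.out.println("Sending mail to " + email);
        } else {
            System.out.println("Skipping unknown user");
        }
    }
}
```

Every new caller of `findEmail` has to remember both checks; forget one and you have a `NullPointerException` waiting in production.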
Taming the NULL: enter the Null Object Pattern
The root cause of the trouble null introduces is that it uses a generalized value to represent a distinct state.
Why not use a Null Object instead? It is more work, since you have to create a NullObject class for each interface whose special state you want to represent.
But the specialization stays isolated in each null object class, rather than scattered across if statements throughout the codebase.
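A minimal sketch of the pattern, with illustrative names: a `NullLogger` implements the same interface as a real logger but does nothing, so callers never branch on null.

```java
interface Logger {
    void log(String message);
}

// Real implementation: writes to standard output.
class ConsoleLogger implements Logger {
    public void log(String message) {
        System.out.println(message);
    }
}

// Null object: same interface, deliberately does nothing.
// The "no logger configured" state gets its own type
// instead of a null reference.
class NullLogger implements Logger {
    public void log(String message) {
        // intentionally empty
    }
}

public class NullObjectDemo {
    // No null check needed here: whatever Logger arrives,
    // calling log() is always safe.
    static void process(Logger logger) {
        logger.log("processing started");
    }

    public static void main(String[] args) {
        process(new ConsoleLogger()); // prints the message
        process(new NullLogger());    // silently ignores it
    }
}
```

The `if (logger != null)` checks that would otherwise appear at every call site collapse into one class that encodes the "absent" behavior exactly once.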