I'm curious what everyone thinks. In SQL (at least in Oracle), NULL conceptually means "I don't know the value", so NULL = NULL is not true. (Strictly speaking, the comparison evaluates to NULL, i.e. unknown, which conditions then treat as false.)
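As a minimal sketch of that three-valued behaviour, here is a C# snippet using the System.Data.SqlTypes wrappers, which mimic SQL NULL semantics; the variable names are just for illustration:

```csharp
using System;
using System.Data.SqlTypes;

class NullComparisonDemo
{
    static void Main()
    {
        // SqlInt32.Null models SQL's "unknown value" NULL.
        SqlInt32 a = SqlInt32.Null;
        SqlInt32 b = SqlInt32.Null;

        // Comparing two SQL NULLs yields SqlBoolean.Null (unknown), not True.
        SqlBoolean result = (a == b);
        Console.WriteLine(result.IsNull);   // True: the comparison is "unknown"

        // In a filtering context the unknown result behaves like false,
        // which is why WHERE NULL = NULL matches no rows.
        Console.WriteLine(result.IsTrue);   // False
    }
}
```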
This makes sense to me, but in most OO languages null means "no reference", so null == null should probably be true. That is the usual convention in C#, for example, when overriding Equals.
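As a rough sketch of that convention, here is a hypothetical Money class (the type and its members are made up for the example) whose equality members follow the usual .NET guidance, where two null references compare equal:

```csharp
using System;

// Hypothetical reference type illustrating the common C# convention:
// a null reference equals another null reference, but never a live instance.
class Money
{
    public decimal Amount { get; }
    public Money(decimal amount) => Amount = amount;

    public override bool Equals(object obj)
    {
        // A live instance is never equal to null...
        if (obj is null) return false;
        if (ReferenceEquals(this, obj)) return true;
        return obj is Money other && Amount == other.Amount;
    }

    public override int GetHashCode() => Amount.GetHashCode();

    public static bool operator ==(Money left, Money right)
    {
        // ...but null == null is treated as true.
        if (left is null) return right is null;
        return left.Equals(right);
    }

    public static bool operator !=(Money left, Money right) => !(left == right);
}
```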
On the other hand, null is still often used to mean "I don't know" in object-oriented languages, and making null == null evaluate to false could arguably make the code more meaningful in certain domains.
Tell me what you think.
language-agnostic oop
George Mauer