Most strongly-typed languages (C++, Java, C#, etc.) do not support NULL values for their primitive types (integers, floats, etc.). There are quite a few ways to get around this limitation, from using an additional null-indicator variable to wrapping the primitive type in a class, but the most common is using a "magic value" to represent NULL. Generally, this "magic value" is an otherwise-unlikely value, such as the smallest representable double, -1.79769313486232e308.
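For reference, here's a minimal sketch of the magic-value convention done the conventional way (the class and field names are hypothetical, and `-Double.MAX_VALUE` stands in for the sentinel mentioned above):

```java
public class MagicNull {
    // Hypothetical sentinel: the most negative finite double,
    // i.e. roughly -1.79769313486232e308.
    static final double NULL_DOUBLE = -Double.MAX_VALUE;

    static boolean isNull(double value) {
        // Exact comparison against the sentinel -- no fuzzy ranges.
        return value == NULL_DOUBLE;
    }

    public static void main(String[] args) {
        double price = NULL_DOUBLE;          // "no value yet"
        System.out.println(isNull(price));   // true
        price = 19.99;
        System.out.println(isNull(price));   // false
    }
}
```

Note the exact equality test: only the sentinel itself reads as NULL, so no legitimate data can be mistaken for a missing value. Keep that in mind for what follows.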

In Mihhon's case, the original coders decided to use a range of values to represent NULL. This led to the following function to test if a double is NULL:

static public final boolean isNull(double _value)
{
  return (Math.abs(_value) < 0.0001);
}

To some, 0.0001 may seem like a pretty "large" cut-off point for NULL -- after all, double-precision numbers can get much, much smaller than that. But those numbers, as it turned out, were reserved for the VERY NULL ...

static public final boolean isVeryNull(double _value)
{
  return (Math.abs(_value) < 0.0000000001);
}
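To see why this scheme goes wrong, here's a small sketch that reproduces the two predicates above (the surrounding class and the sample values are mine, for illustration): any genuine result that happens to land near zero is swallowed as NULL, and every VERY NULL value is, of course, also plain NULL.

```java
public class NullDemo {
    // The two predicates as defined above.
    static boolean isNull(double v)     { return Math.abs(v) < 0.0001; }
    static boolean isVeryNull(double v) { return Math.abs(v) < 0.0000000001; }

    public static void main(String[] args) {
        // A hypothetical legitimate computed value that happens to be small.
        double interestDelta = 0.00005;
        System.out.println(isNull(interestDelta)); // true: real data reads as NULL

        // The ranges nest, so VERY NULL values are indistinguishable
        // from ordinary NULLs by isNull() alone.
        System.out.println(isVeryNull(1e-11));     // true
        System.out.println(isNull(1e-11));         // also true
    }
}
```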