While doing code reviews, I have developed a strong conviction that the use of null (or nil, NULL, nullptr, etc., depending on your programming language) generally causes more problems than it solves.

The fundamental problem with null is that it tries to represent the absence of a value while itself being assignable as a value. This contradiction snowballs and manifests as the problems we see in everyday production code.

Here, I have attempted to document the kinds of issues I commonly run into when null is used.

Poor Language Decisions

Certain languages do not handle the edge cases of nullable variables consistently. For example, Java has both primitive and reference variable types.

Primitive variables cannot be initialized to null; doing so produces a compile-time error.

int i = null; // this fails to compile

error: incompatible types: <null> cannot be converted to int
       int i = null;

But a variable of the reference type Integer can be initialized to null without complaint.

Integer j = null; // perfectly fine

Now, in Java, when you assign an Integer reference to a primitive int, the compiler silently performs a type conversion known as unboxing.

Integer p = 100;
int q = p; // works fine: p is silently unboxed
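
Under the hood, this unboxing conversion effectively rewrites the assignment as a call to Integer.intValue(). A rough sketch of what the compiler generates:

Integer p = 100;
int q = p.intValue(); // what int q = p; compiles down to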

If you try the same with an Integer initialized to null, you get a runtime exception instead.

Integer i = null;
int j = i; // throws a runtime exception

The above, when executed, throws an exception:

Exception in thread "main" java.lang.NullPointerException
	at Main.main(HelloWorld.java:5)
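
The failure makes sense once the hidden call is spelled out: the unboxing invokes intValue() on a null reference. A sketch of what actually runs:

Integer i = null;
int j = i.intValue(); // calling a method on a null reference throws NullPointerException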

This kind of inconsistency can be extremely confusing and can lead to hard-to-track bugs being introduced into the codebase.
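
If you must accept a possibly-null Integer, one way to make the fallback explicit at the assignment site is java.util.Objects.requireNonNullElse (available since Java 9). A minimal sketch, assuming a default of 0 is acceptable:

import java.util.Objects;

public class Main {
    public static void main(String[] args) {
        Integer i = null;
        // Explicit fallback instead of a blind unboxing:
        // returns i if it is non-null, otherwise the supplied default.
        int j = Objects.requireNonNullElse(i, 0);
        System.out.println(j); // prints 0
    }
}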

