Other programming questions by a non-programmer.

Garbage collectors condition programmers to not even think about these things. This saves some time, but I'm not sure it's always beneficial for the overall design of the code.
GCs add extra processing overhead. They are unlikely to be used in an RTOS or any OS with tight scheduling constraints. OSes are usually designed to be "lean and mean" (except Windows, which is just mean :-).

GCs are justified for languages such as Lisp, which strive for high abstraction.
Just last week I entered my card number into a commercial website and hit the Submit button, only to be greeted by a message that said "NullReferenceException: Object Reference Not Set to an instance of an object." That did not inspire confidence. From that lived experience, I can only conclude that garbage-collected languages are all terrible and nobody should ever use them for any purpose whatsoever. ;)
The inventor of null references has apologized for his invention:

I respectfully disagree with him. There are times when you need to express "I don't know", and you can't do that without the concept of null. Zero will not do, because zero is a definite value. Imagine I'm calculating the amount of money I owe you. If I set it to 0 while I'm still calculating, I'm stating that I don't owe you any money. The fact of the matter is that I don't know yet.
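To make that concrete, here's a minimal Java sketch (my own illustration, not from the original post) where null deliberately means "not yet calculated", which is a different statement than "zero":

```java
import java.math.BigDecimal;

public class Debt {
    // null means "not yet calculated" -- deliberately distinct from
    // BigDecimal.ZERO, which would assert "I owe you nothing"
    private BigDecimal amountOwed = null;

    public boolean isKnown() {
        return amountOwed != null;
    }

    public void settle(BigDecimal amount) {
        amountOwed = amount;
    }

    public static void main(String[] args) {
        Debt d = new Debt();
        System.out.println(d.isKnown()); // false: still unknown, not "zero"
        d.settle(BigDecimal.ZERO);
        System.out.println(d.isKnown()); // true: now definitely zero
    }
}
```

If the field were initialized to `BigDecimal.ZERO` instead, there would be no way to tell "computed and owes nothing" apart from "not computed yet".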

Doing away with null references would encourage sloppy programming even more than garbage collectors do. Not only would you not know or care what you're storing, you wouldn't even know whether what you're storing is valid.
The inventor of null references has apologized for his invention:
Interesting. I wonder if he's talking more about the danger of nulls causing crashes (and security problems) rather than merely raising an exception when the code tries to use one.

Even if a language forced every new object to carry a valid default value rather than a common null, the program would still have to notice at runtime when the correct data never arrives, and handle the error rather than silently proceeding with the default object. In that sense, null reference exceptions seem like a reasonable solution to me. Of course, I'm not a world-famous computer scientist who has actually spent decades thinking about these problems.
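A quick Java sketch of that point (my own illustration): with a forced non-null default, nothing blows up when the real data never arrives, so the program still needs an explicit runtime check. The `DEFAULT_CARD` name and the empty-string default are assumptions for the example:

```java
public class DefaultVsNull {
    // A forced non-null default: the language guarantees no NPE,
    // but an empty card number is still not valid data.
    static final String DEFAULT_CARD = "";

    public static void main(String[] args) {
        String card = DEFAULT_CARD; // the real value never arrived

        // No exception fires, so the error must be detected by hand:
        if (card.isEmpty()) {
            System.out.println("error: card number missing");
        }
    }
}
```

With a null instead of a default, forgetting this check would at least fail loudly at the point of use; with the default, it can fail silently much later.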

All trade-offs considered, I still think OP should learn C, but I also think OP is long gone by now.
That's not entirely true. The problem with "null" is that it's literally nothing: it can't even have a type, and it's acceptable anywhere a reference type is expected. I don't necessarily agree that it's needed, because again, a lot depends on thoughtful design and programming, but there is an alternative that avoids many of these stupid bugs: Optional<T>. If you need something that can be "unknown", you get a value that explicitly says "this is an unknown/undefined number", clearly distinct from the value 0, but definitely not just "nothing".
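For illustration, here's a minimal sketch of that idea using Java's `java.util.Optional` (the `Balance` class and its method names are my own invention): the "unknown" case is visible in the type, and the caller is forced to deal with it instead of tripping over a hidden null.

```java
import java.util.Optional;

public class Balance {
    // Optional<T> makes "unknown" an explicit, typed value,
    // unlike a bare null that fits anywhere a reference fits
    private Optional<Integer> cents = Optional.empty();

    public void record(int value) {
        cents = Optional.of(value);
    }

    public String describe() {
        // The empty case must be handled here; no NPE is possible
        return cents.map(c -> "owes " + c + " cents")
                    .orElse("amount unknown");
    }

    public static void main(String[] args) {
        Balance b = new Balance();
        System.out.println(b.describe()); // amount unknown
        b.record(0);
        System.out.println(b.describe()); // owes 0 cents
    }
}
```

Note how `Optional.empty()` ("unknown") and `Optional.of(0)` ("definitely zero") are distinct values, which is exactly the distinction the money-owed example earlier in the thread asks for.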