Took me 2 hours to find out why the final output of a neural network was a bunch of NaNs. This is always very annoying, but I can't really complain, it makes sense. Just sucks.
That could be a nice way. Sadly it was in a C++ code base (using TensorFlow), so no such nice things (and it would be slow too). I skill-issued myself by assuming the struct would be zero-initialized: MyStruct input; is not, while MyStruct input{}; is (that was the fix). Long story.
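For anyone curious, here is a minimal sketch of the difference (MyStruct and its members are made up for illustration): default-initialization leaves members with indeterminate values, value-initialization zeroes them.

    #include <iostream>

    // Hypothetical struct standing in for the network's input buffer;
    // no constructor, no default member initializers.
    struct MyStruct {
        float bias;
        float weights[4];
    };

    int main() {
        MyStruct a;    // default-initialization: members hold indeterminate (garbage) values
        MyStruct b{};  // value-initialization: all members are zero-initialized

        // Reading a.bias is technically undefined behavior; in practice it often
        // yields whatever was on the stack, which can feed garbage or NaNs downstream.
        std::cout << "a.bias (uninitialized): " << a.bias << '\n';
        std::cout << "b.bias (zeroed):        " << b.bias << '\n';
        return 0;
    }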
Something similar happened to me while writing my own engine in C++ with OpenGL. I forgot to do something, maybe assign a pointer or pass a variable. In the end I had copied a NaN into the vertices of my Model, which was meant to be a wrapper around the data I wanted to read and visualize.
Printing the entire Model to the terminal left me confused about why everything was suddenly NaN when it had started out fine.