My memory of the exact details here is fuzzy, but I think this depended on whether your TV picked up a digital signal, an analog signal, or both. I remember around that time we had a TV that would show static on some channels and a blue input screen on others.
Yeah, for instance, the semi-ubiquitous "small TV with a VHS player built in" that was in a ton of minivans and kids' rooms well into the early 2000s only supported analog cable/antenna signals, so it would show black-and-white static when there was no signal.
I remember back in the Wii days, when I was young, we had a flatscreen that would go to the digital "no signal" pattern with no input. Once in a while, though, it would show that loud static instead, so I think mine had both tuners.
I don’t really have a point here, just wanted to share.
The way digital works, you would either get a "No signal" indicator (because the circuitry detects that the signal-to-noise ratio is too low to demodulate) or squarish artifacts (because digital video compression operates on blocks of pixels, so corrupted data shows up as blocky glitches rather than uniform noise).
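As a rough sketch of that first decision (a minimal simulation, not any real tuner's firmware; the 15 dB cutoff and all the names here are assumptions for illustration):

```python
import numpy as np

# Hypothetical cutoff: real tuners use modulation-specific thresholds,
# but the principle is the same -- below some SNR, error correction
# can't keep up, so the set shows a clean indicator instead of noise.
SNR_THRESHOLD_DB = 15.0

def estimate_snr_db(received, reference):
    """Estimate SNR by comparing received samples to a known reference
    (digital broadcasts embed pilot signals for roughly this purpose)."""
    noise = received - reference
    signal_power = np.mean(reference ** 2)
    noise_power = np.mean(noise ** 2)
    return 10 * np.log10(signal_power / noise_power)

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 10_000)
reference = np.sin(2 * np.pi * 50 * t)  # stand-in for the carrier

for noise_amplitude in (0.05, 2.0):
    received = reference + noise_amplitude * rng.standard_normal(t.size)
    snr = estimate_snr_db(received, reference)
    if snr < SNR_THRESHOLD_DB:
        print(f"SNR {snr:5.1f} dB -> show 'No signal'")
    else:
        print(f"SNR {snr:5.1f} dB -> decode the video stream")
```

The point is just that the decision is all-or-nothing: either the signal is clean enough to decode, or the set gives up and draws its own screen. There's no in-between where you'd see noise directly.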
Technically, it's not about the display technology but about the signal and tuner: specifically, whether it's analog or digital. Some modern TVs still have analog or hybrid tuners for backwards compatibility and for regions that still broadcast analog, so they can display static. For instance, in Ukraine we finished the switch to digital TV only a couple of years ago. If your TV had no digital tuner (as was the case for many), you had to buy a digital-to-analog converter (DAC) box. Retirees and pensioners got them for free, sponsored by the government.
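A minimal sketch of that hybrid-tuner behavior (the channel states and names here are made up for illustration, not a real driver interface):

```python
from dataclasses import dataclass

@dataclass
class Channel:
    digital_lock: bool    # did the digital demodulator sync to a stream?
    analog_carrier: bool  # is an analog carrier present on this frequency?

def tune(channel: Channel, has_analog_tuner: bool) -> str:
    if channel.digital_lock:
        return "decode digital stream"
    if has_analog_tuner:
        if channel.analog_carrier:
            return "demodulate and show the analog picture"
        # The analog path has no concept of "no signal": it amplifies
        # and demodulates whatever RF noise is on the frequency, which
        # comes out as black-and-white static.
        return "show demodulated noise (static)"
    return "show 'No signal' screen"

empty = Channel(digital_lock=False, analog_carrier=False)
print(tune(empty, has_analog_tuner=True))   # hybrid tuner: static
print(tune(empty, has_analog_tuner=False))  # digital-only: 'No signal'
```

So a digital-only set can never show static on an empty channel, while a hybrid one falls back to the analog path and shows whatever noise is there.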