In "Making Wrong Code Look Wrong" (2005), Joel Spolsky argues that people outside Microsoft disliked Hungarian notation because they misunderstood it. He begins with a story about cleaning a giant greasy Israeli bread factory, which of course symbolizes the real point: Hungarian notation told you what kind of thing a variable is, in a way that goes beyond the number of bytes it occupies, which was all the language's type told you. For instance:
In WYSIWYG word processing, you have scrollable windows, so every coordinate has to be interpreted as either relative to the window or relative to the page, and that makes a big difference, and keeping them straight is pretty important.
So you give all coordinates relative to the window a little "rw" prefix, so people know not to mix them up with coordinates relative to the page. Unfortunately, people outside Microsoft didn't understand the point of Hungarian notation. Page vs. window could not be encoded in the type system itself, so the prefix carried real information; prefixes like "l" for long, by contrast, were totally redundant, since the type system already tells you the variable is a long. The redundant kind is what became popular, until people realized it was a useless pain and gave up on Hungarian altogether. But, Spolsky writes, now that you understand the distinction, you can use it to make wrong code look wrong: prevent subtle bugs by encoding in prefixes what the type system can't.
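To make that concrete, here's a rough sketch of the idea (the names and numbers are mine, not from the Word codebase, and I've rendered it in Gleam syntax only for consistency with the example further down). Both coordinates are plain Ints, so the compiler can't tell them apart; only a reader who has internalized the prefixes can.

```gleam
// Apps Hungarian, sketched: "rw" means relative-to-window, "rp" means
// relative-to-page. These prefixes are illustrative, not Word's real ones.
// Both values are just Ints, so the compiler sees no difference.
pub fn caret_offset() -> Int {
  let rw_top = 40 // top of the caret, relative to the window
  let rp_top = 640 // top of the caret, relative to the page
  // Mixed prefixes: a trained reader smells the bug instantly,
  // but nothing stops this from compiling and running.
  rw_top - rp_top
}
```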
...In 2026, though, I wonder if we read this and say: ah, this was just the Microsoft Word and Excel teams working around the lack of a powerful type system? My type system (e.g. in Gleam) can totally say that a coordinate is relative to the window. More importantly, my compiler and IDE automate the bug catching. Not so with Hungarian, even done right:
"The compiler won’t help you if you assign one to the other and Intellisense won’t tell you bupkis. But they are semantically different; they need to be interpreted differently and treated differently and some kind of conversion function will need to be called if you assign one to the other or you will have a runtime bug. If you’re lucky."
Hungarian notation made it easy for humans to smell when the prefixes looked wrong; today your type system lets the compiler check whether coordinates are relative to the page or the window. That is better, because you no longer need to do the work of interpreting the prefix.
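Concretely, here's a minimal Gleam sketch of that alternative (the type and function names are mine, not from any real codebase): window and page coordinates become distinct types, the conversion the quote alludes to becomes an explicit function, and handing a page coordinate to something that wants a window coordinate stops compiling.

```gleam
// Two distinct types for the two coordinate spaces. The fields are the
// same shape, but the compiler treats the types as unrelated.
pub type WindowCoord {
  WindowCoord(x: Int, y: Int)
}

pub type PageCoord {
  PageCoord(x: Int, y: Int)
}

// The conversion made explicit: subtract the window's scroll offset
// to go from page space to window space.
pub fn page_to_window(p: PageCoord, scroll_x: Int, scroll_y: Int) -> WindowCoord {
  WindowCoord(x: p.x - scroll_x, y: p.y - scroll_y)
}

pub fn draw_caret(at: WindowCoord) -> Nil {
  // Rendering stub; the signature is the point. Calling this with a
  // PageCoord is a compile error, no prefix-reading required.
  let _ = at.x
  Nil
}
```

Now `draw_caret` simply refuses a page coordinate at compile time; nobody has to notice a wrong-looking prefix in review.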
That said, the jump from no prefix to prefix probably adds more help than the jump from prefix to compiler check, and how much mileage you get out of type safety probably depends on what your project is doing. Other factors, like sleeping well, probably matter more, so I don't want to oversell language benefits. Still, it sounds like the Word and Excel teams might have had a real need for a stronger type system. But instead of straightening everything out rigorously, they just slapped some prefixes on variables and released world-changing office tools that made everyone billions, what losers!