The thing that fascinates me is that every single digital microwave I've ever used behaves the same way and allows the seconds place to run from 0 to 99.
My best guesses are:
- There's some ASIC that's been around forever and everyone uses it (a cockroach chip like the 555)
- The first digital microwave did this and all subsequent ones followed
- There are actually implementation reasons why this is way more sensible (see the sketch after this list).
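On that last guess: here's a minimal sketch, in C, of the "shift digits in from the right" entry model I suspect most of these use. Everything here is my guess, not any real microwave's firmware. The point is that if the four display digits are just one buffer with a colon painted in the middle, nothing ever checks that the seconds field is under 60, so keying "9, 0" naturally gives you 0:90.

```c
#include <stdio.h>

/* Hypothetical digit-entry model: four display digits in one buffer,
 * most significant first. Each keypress shifts everything left and the
 * new digit enters at the right. The colon between digits 2 and 3 is
 * purely cosmetic; no code ever validates that "seconds" < 60. */

static int digits[4];

static void press(int key) {
    digits[0] = digits[1];
    digits[1] = digits[2];
    digits[2] = digits[3];
    digits[3] = key;
}

int main(void) {
    press(9);
    press(0);
    /* Prints "00:90", the Wild West display in question. */
    printf("%d%d:%d%d\n", digits[0], digits[1], digits[2], digits[3]);
    return 0;
}
```

Under that model, allowing 0-99 in the seconds place isn't a feature anyone designed; it's the absence of a validation step that would cost extra logic.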
Writing it in software, there are different ways folks would probably implement the countdown; "subtract one, recalculate minutes and seconds, display" seems reasonable, for example. But nope: every one I've ever used is just the Wild West in the seconds department. The sketch below contrasts the two approaches.
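To make that concrete, here's a short C sketch contrasting the two countdown styles. The names and structure are mine, assumed for illustration: `tick_normalized` keeps one total-seconds counter and re-derives MM:SS each second, so "90" immediately shows as 1:29, 1:28, and so on; `tick_literal` decrements the seconds field in place and only borrows a minute when it hits zero, giving 0:89, 0:88, ... down through 0:60, 0:59, which is the behavior every microwave I've used exhibits.

```c
#include <stdio.h>

/* Hypothetical countdown styles (names are mine, not real firmware). */

/* Normalized: one counter, MM:SS derived fresh every tick. */
static void tick_normalized(int *total) {
    if (*total > 0)
        (*total)--;
}

/* Literal: decrement the seconds field directly; borrow a minute
 * only when seconds reach zero. Never re-normalizes 90 into 1:30. */
static void tick_literal(int *min, int *sec) {
    if (*sec > 0) {
        (*sec)--;
    } else if (*min > 0) {
        (*min)--;      /* borrow a minute */
        *sec = 59;
    }
}

int main(void) {
    int total = 90;          /* "90" keyed in, read as 90 seconds    */
    int min = 0, sec = 90;   /* "90" keyed in, stored field by field */

    for (int i = 0; i < 3; i++) {
        tick_normalized(&total);
        tick_literal(&min, &sec);
        printf("normalized %d:%02d   literal %d:%02d\n",
               total / 60, total % 60, min, sec);
    }
    return 0;
}
```

The literal version is arguably simpler on a tiny controller: no division, no modulo, just decrement-and-borrow on the digits you're already displaying.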
I don't think it does---I think OOP is doing the math and then inputting the sum.