I have often found that working through the final details of a problem makes most of its related problems simple. In a professional environment especially, people tend to skip the details and focus only on the given problem at hand. The trouble with that approach is that, while it solves a short-term problem, each new problem that arises afterwards looks like a different one to solve.
I feel engineers need to think through issues and get to the root. For example, when I was at school, I remember writing a program to simulate a clock. When the problem was posed, the straightforward answer was to draw a circle, divide 360 degrees by 60, and draw a line connecting the center of the circle to the point on its edge at 0 degrees. Every 60 seconds, erase the current line and draw a new one advanced by 6 degrees, and so on. This was the minute hand. The hour hand advances based on the minute hand's advance; you just have to remember to redraw the hour hand when the minute hand crosses over it. Things look simple so far. If you want to get fancy, you could add a second hand that advances every second, 1/60 of a minute. This is as far as I got at school.
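The geometry above can be sketched in a few lines. This is a minimal illustration, not the original school program; the function name `hand_angles` and the convention of measuring degrees clockwise from 12 o'clock are my own assumptions.

```python
def hand_angles(hour, minute, second):
    """Return (hour, minute, second) hand angles in degrees,
    measured clockwise from the 12 o'clock position."""
    # 360 degrees / 60 ticks = 6 degrees per tick of the second
    # and minute hands.
    sec_angle = second * 6.0
    # The minute hand also creeps 6/60 = 0.1 degree per second.
    min_angle = minute * 6.0 + second * 0.1
    # The hour hand covers 30 degrees per hour, 0.5 per minute.
    hour_angle = (hour % 12) * 30.0 + minute * 0.5 + second * (0.5 / 60)
    return hour_angle, min_angle, sec_angle
```

Deriving the hour hand's position from the minutes, rather than redrawing it independently, is what keeps the hands consistent when the minute hand crosses over the hour hand.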
When I got to college, I learnt that the computer has a clock of its own. You don't really have to compute time; you can read the clock fields from the processor and display them in the fashion described before. Later, I realised that it takes some time to draw the picture, so there is a delay between when I sample the time and when I display it. Usually it is on the order of microseconds or less and does not matter to the human eye. Since we sample the time from the common clock source every frame, we will be off by almost the same small amount each time, assuming the drawing delay is roughly constant. In the program written at school, though, those microseconds would have built up over time into a full second, at which point the clock would be visibly off by a second. It takes only a few days for that error to build up. That is not good.
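The difference between the two designs can be sketched with some back-of-the-envelope arithmetic. The 5-microsecond draw delay below is an assumed figure, and the function names are illustrative; the point is only that one error accumulates and the other does not.

```python
FRAME_DELAY = 5e-6  # assumed time (seconds) to draw one frame

def counting_clock_error(ticks):
    """School-style clock: it keeps its own counter and advances it
    once per tick, so every tick's draw delay accumulates into the
    displayed time."""
    return ticks * FRAME_DELAY

def sampling_clock_error(ticks):
    """College-style clock: it re-reads the system clock before each
    redraw, so only the most recent draw delay is ever visible,
    no matter how many ticks have passed."""
    return FRAME_DELAY

# At 5 microseconds per tick, the counting clock falls a full second
# behind after 1 / 5e-6 = 200,000 ticks -- a bit over two days of
# one-second ticks -- while the sampling clock stays 5 us off forever.
```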
As we think about it more, we could now question why we consider the CPU clock reliable. How does it operate? What ambiguities are built into it? CPU clocks are driven by crystal oscillators. A crystal oscillator is a device based on the vibration of a quartz crystal: given a voltage, it generates a precise frequency, which in turn can be divided down to the frequencies corresponding to seconds and minutes. But variations in voltage or temperature cause corresponding variations in the generated clock. However, when the original frequency, on the order of megahertz, is divided down to seconds, the percentage error becomes barely noticeable. The best crystal oscillators have recorded fractional frequency errors on the order of 10^-9, which works out to well under a millisecond of drift per day. What do we do now if we want to do better?
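Two small calculations make this concrete. The classic watch crystal runs at 32,768 Hz precisely because that is 2^15, so fifteen stages of binary division yield an exact 1 Hz tick; and a fractional frequency error translates to daily drift simply by multiplying by the 86,400 seconds in a day. The 20 ppm figure below is a typical tolerance for a cheap watch crystal, quoted as an assumption rather than from the text.

```python
CRYSTAL_HZ = 32_768  # standard watch-crystal frequency: 2**15 Hz

def one_hz_ticks(oscillations):
    """Divide raw crystal oscillations down to 1 Hz 'second' ticks.
    A real watch does this with a 15-stage binary ripple counter."""
    return oscillations // CRYSTAL_HZ

def drift_seconds_per_day(fractional_error):
    """Daily drift implied by a fractional frequency error."""
    return fractional_error * 86_400

# A typical +/-20 ppm watch crystal may drift about 1.7 s per day,
# while a 1e-9 fractional error stays under 0.1 ms per day.
```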
Atomic oscillators come to the rescue. Atoms transition between different energy levels, and the radiation from that transition is used to generate clocks. By the international standard definition, one second equals 9,192,631,770 cycles of the radiation of a caesium atom; such clocks, by the way, have an error of about 10^-10 seconds per day. NIST predicts that as technology improves, the best we could get is a 0.01 nanosecond error (10^-11 seconds per day) by 2010. For all practical purposes, one could argue that if a given clock will only be used for, say, ten years, it can accommodate roughly a 0.27 millisecond variation per day and still never show a false time, because a full second of error accumulates only after 10 * 365.25 days. So an engineering decision could differ based on requirements.

It is also interesting to note that there is no perfect clock. One can even try to infer how much time has been lost to error over the evolution of the earth. Retrospectively, at a very high level, we might consider a clock as just something that counts seconds and hours, while in reality there is much more to it. Knowing the details can yield great insights into novel solutions. Solving problems quickly is good, but sometimes stepping back and looking at the big picture in its entirety changes one's perspective and enables one not only to be innovative, but also to be highly productive.
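The ten-year error budget above is one line of arithmetic: the largest constant daily drift that never accumulates to a full second over the clock's service life. The variable names here are mine; only the ten-year lifetime comes from the text.

```python
LIFETIME_DAYS = 10 * 365.25  # the ten-year service life assumed above

# Largest constant daily drift that still accumulates to less than
# one full second of error over the whole lifetime.
max_daily_drift_s = 1.0 / LIFETIME_DAYS

# Works out to roughly 0.27 milliseconds per day -- far looser than
# even a mediocre crystal oscillator needs.
```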