We know some computers use 64-bit words.

2^64 is approximately 1.8 x 10^19 - that’s a pretty large number.

So in fact, if we had started incrementing a 64-bit counter once per second at the beginning of the universe (about 13.8 billion years ago), the MSBs of the counter would still be all zeroes: only the low 59 bits or so would be in use.
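A quick back-of-the-envelope check of this claim (the constants below are my own rounded estimates, not from the original text):

```python
# Counting once per second since the Big Bang still leaves the top
# bits of a 64-bit counter at zero.
AGE_OF_UNIVERSE_YEARS = 13.8e9           # rough current estimate
SECONDS_PER_YEAR = 365.25 * 24 * 3600    # ~3.16e7

ticks = int(AGE_OF_UNIVERSE_YEARS * SECONDS_PER_YEAR)
print(ticks)               # ~4.4e17 total increments
print(ticks.bit_length())  # 59 -- only the low 59 bits are used
print(ticks < 2**64)       # True: nowhere near wrapping
```

So roughly the top five bits of the counter would still be zero after the entire age of the universe.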

*~ Excerpts from a book on Digital Signal Processing ~*

The original meaning of “decimation” was the Roman army punishment of killing one out of every ten soldiers in a unit, as a way to instill order through fear.

The same idea appears in the term “decimation” as used in signal processing and VLSI. In signal processing, decimation means downsampling: we keep 1 out of every N samples, where N is the decimation factor. For example, if data sampled at 100 MHz is decimated by a factor of 10, we keep 1 sample for every 10 clocks of the 100 MHz clock, so the output rate is 10 times lower, at 10 MHz. The term also comes up when describing clock domains.
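The keep-1-of-N step can be sketched in a few lines (the function name and data are illustrative, not from the original; note that a real DSP decimator low-pass filters first to avoid aliasing, while this shows only the downsampling step):

```python
def decimate(samples, n):
    """Keep every n-th sample (decimation factor n).

    This is only the downsampling step; a practical decimator
    applies an anti-aliasing low-pass filter before discarding
    samples.
    """
    return samples[::n]

data = list(range(20))    # pretend these arrived at 100 MHz
out = decimate(data, 10)  # output rate: 100 MHz / 10 = 10 MHz
print(out)                # [0, 10] -- 1 sample kept per 10 input samples
```

For every 10 input samples, exactly one survives, which is why the output clock can run 10 times slower.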