For those of you who don't know, a googol is a 1 followed by 100 zeros, or 10^100. So how long would it take a computer to count that high? Assume you have a 2.5 GHz processor that can efficiently increment a counter (1 clock cycle per addition) all the way up to a googol (the counter itself requires 333 bits, or about 42 bytes, to store). Counting to 10^100 takes 10^100 clock cycles, which at 2.5 x 10^9 cycles per second works out to 4 x 10^90 seconds, or about 1.27 x 10^83 years.
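If you want to sanity-check those figures yourself, here is a quick back-of-the-envelope script (the clock speed and one-add-per-cycle assumption are the same ones used above):

```python
import math

GOOGOL = 10**100
CLOCK_HZ = 2.5e9                      # 2.5 GHz, one increment per cycle
SECONDS_PER_YEAR = 365.25 * 24 * 3600

seconds = GOOGOL / CLOCK_HZ           # total wall-clock time in seconds
years = seconds / SECONDS_PER_YEAR

print(f"{seconds:.2e} seconds")       # → 4.00e+90 seconds
print(f"{years:.2e} years")           # → 1.27e+83 years

# Storage needed to hold a googol as a binary integer
bits = math.ceil(math.log2(GOOGOL))
print(bits, "bits =", math.ceil(bits / 8), "bytes")  # → 333 bits = 42 bytes
```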
Now, for a second, let's assume we could distribute this counting task across several processors. It would still take more 2.5 GHz cores than there are atoms in the known universe (roughly 10^80) to count that high in 1 year.
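The core count follows directly from the numbers above; a quick sketch, taking 10^80 as a commonly cited rough estimate for the number of atoms in the observable universe:

```python
GOOGOL = 10**100
CLOCK_HZ = 2.5e9                      # 2.5 GHz, one increment per cycle
SECONDS_PER_YEAR = 365.25 * 24 * 3600
ATOMS_IN_UNIVERSE = 1e80              # commonly cited rough estimate

# Each core can perform this many increments in one year
counts_per_core_per_year = CLOCK_HZ * SECONDS_PER_YEAR

# Cores needed to reach a googol in a single year
cores_needed = GOOGOL / counts_per_core_per_year
print(f"{cores_needed:.2e} cores")    # → 1.27e+83 cores
print(f"that's {cores_needed / ATOMS_IN_UNIVERSE:.0f}x the atoms in the universe")
```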
Luckily, because computing power doubles approximately every 2 years (see Moore's Law), it isn't worth writing a program or designing a chip to do this quite yet. If you started running the program today, a new processor that starts in 2 years would finish in half the time.
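You can even ask how long you *should* wait before starting. A rough sketch, assuming Moore's Law holds indefinitely (a big assumption) so that runtime halves every 2 years: if you wait w years, total time is w + T0 / 2^(w/2), and setting the derivative to zero gives the optimal wait.

```python
import math

T0 = 1.27e83  # years to count to a googol on today's hardware (from above)

def total_time(wait_years):
    """Total years elapsed if we wait, then run on the faster hardware."""
    return wait_years + T0 / 2 ** (wait_years / 2)

# Minimizing w + T0 * 2**(-w/2): the derivative is zero when
# 2**(w/2) = T0 * ln(2) / 2
best_wait = 2 * math.log2(T0 * math.log(2) / 2)
print(f"wait ~{best_wait:.0f} years, finish ~{total_time(best_wait):.0f} years from now")
```

Under this (optimistic) assumption, the whole project collapses from 10^83 years to a few centuries of waiting plus a couple of years of actual counting.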