Problem Detail: In one of my textbooks it is mentioned that "the running time of this algorithm is 200 computer years". Can somebody please explain the meaning of a computer year?
Asked By : Kumar
Answered By : adrianN
Computer years, like man-hours, are a measure of work. A computer year is the work one computer (of some agreed-upon speed…) performs in a year. Equivalently, 365 computers working in parallel for one day also perform one computer year of work.
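The arithmetic behind the answer can be sketched in a few lines. This is only an illustration of the unit (machines multiplied by time); the machine counts and the 200-computer-year figure below are taken from the question, not from any real benchmark.

```python
HOURS_PER_YEAR = 365 * 24  # 8760 hours in a (non-leap) year

def computer_years(num_computers: int, hours_each: float) -> float:
    """Work performed = machines x time, expressed in computer-years."""
    return num_computers * hours_each / HOURS_PER_YEAR

# One computer running for a full year -> 1 computer-year
print(computer_years(1, HOURS_PER_YEAR))  # 1.0

# 365 computers running in parallel for one day (24 h) -> also 1
print(computer_years(365, 24))  # 1.0

# A "200 computer year" job on 1000 parallel machines (hypothetical
# cluster size) would still take 200 * 8760 / 1000 = 1752 hours:
days_needed = 200 * HOURS_PER_YEAR / 1000 / 24
print(days_needed)  # 73.0 days
```

The point of the unit is exactly this interchangeability: total work stays fixed while machines and wall-clock time trade off against each other.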
Best Answer from StackExchange
Question Source : http://cs.stackexchange.com/questions/11220