Costs

I need to know how long to keep a piece of hardware in production before replacing it.

Assume that hardware repair costs don't go up with age. (They do, but I'm starting with a simple two-variable equation and will layer on complexity after that.) Assume a standard server that uses 146 kWh per month (that's around 200 W constant). Assume it costs USD $3K. Assume that at any point, I can buy a server with the same power draw for the same price, but its productive work output will increase per Moore's law. So, uh, let K be the cost per kWh.
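
To check that power figure: 200 W running constantly is 200 W * 730 hours in an average month = 146,000 Wh, or 146 kWh.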

Let C be compute units. [1]

Let X be the number of months before we replace a server bought now.

Let M be the monthly cost.

Let T be time, in months, from when C == 1.

So C = 2^(T/18), doubling every 18 months.
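
As a sanity check against footnote [1]: at T = 18 that's C = 2^(18/18) = 2, and at T = 36 it's C = 2^(36/18) = 4.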

So the monthly cost of keeping one server running is M = 146 * K + 3000 / X; that's power plus the purchase price amortized over the X months we keep it.
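
For example, with a hypothetical power price of K = $0.10 per kWh and X = 36 months, that's 146 * 0.10 + 3000 / 36 = 14.60 + 83.33, or about $98 a month.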

But we don't care about how many servers we are running; we care about C, compute units. And we are trying to solve for X, given K.

So, uh, we want to minimize the total cost per compute unit: M/C, no?


To expand: (146 * K + 3000 / X) / C
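
Here is a minimal Python sketch of the quantities defined so far, just to make that expression concrete. The $0.10/kWh power price and the 36-month replacement interval are made-up example values, and the function names are mine, not anything from this page.

 # Sketch of the cost-per-compute-unit expression above.
 def compute_units(t_months):
     # C: compute units of a server bought t_months after the baseline,
     # doubling every 18 months (C == 1 at T == 0).
     return 2 ** (t_months / 18)

 def monthly_cost(k_per_kwh, x_months, price=3000.0, kwh_per_month=146.0):
     # M: power cost plus the purchase price amortized over X months.
     return kwh_per_month * k_per_kwh + price / x_months

 def cost_per_compute_unit(k_per_kwh, x_months, t_months):
     # M / C for a server of the generation bought at t_months.
     return monthly_cost(k_per_kwh, x_months) / compute_units(t_months)

 K = 0.10   # hypothetical power price, $/kWh
 X = 36     # hypothetical replacement interval, months
 for t in (0, 18, 36):
     # newer generations drive the cost per compute unit down
     print(t, round(cost_per_compute_unit(K, X, t), 2))   # 97.93, 48.97, 24.48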




[1] Measured so that a server at the starting point is 1. A new server bought 18 months later would be 2, and one bought 36 months in would be 4, and so on; so one server 36 months in would be worth 4 servers at the starting point. For this we are ignoring the difference in hard drive, CPU, and RAM growth, and just saying that servers get 'better' at Moore's law. Business math, right?

So