According to figures presented at the Uptime Institute Symposium, the average cost of IT systems downtime is close to $5,600 per minute.

In its recent study “Calculating the Cost of Data Center Outages,” the Ponemon Institute identifies the true bottom-line costs of data center downtime as follows:

  • The average cost of downtime across industries was approximately $5,600 per minute.
  • The average reported incident length was 90 minutes, resulting in an average cost per incident of approximately $505,500.
  • For a total data center outage, which had an average recovery time of 134 minutes, average costs were approximately $680,000.
  • For a partial data center outage, which averaged 59 minutes in length, average costs were approximately $258,000.
  • The report cited the highest cost of a single event at about $1 million (more than $11,000 per minute).
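
(For context, these per-incident figures are roughly consistent with the per-minute rate: $5,600/minute × 90 minutes ≈ $504,000, close to the $505,500 average reported per incident.)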

(The complete Emerson-Ponemon Institute report is available online.)

How costly is IT downtime in your organization?

A Simple Estimate:

Estimated average cost of one hour of downtime = (Employee Costs per Hour × Fraction of Employees Affected by Outage) + (Average Revenue per Hour × Fraction of Revenue Affected by Outage)

  • Employee Costs per Hour: total salaries and benefits of employees per week, divided by the average number of working hours per week
  • Average Revenue per Hour: total revenue per week, divided by the average number of open hours per week
  • “Fraction of Employees Affected by Outage” and “Fraction of Revenue Affected by Outage” are educated guesses, or plausible ranges
  • Since the goal is evaluating purchases, rough estimates are fine (a calculation sketch follows below)
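
As a minimal sketch of the formula above (the function and variable names here are just illustrative, not from the original source), the estimate can be computed directly from the four inputs and the two guessed fractions:

```python
def downtime_cost_per_hour(
    weekly_payroll,           # total salaries and benefits per week ($)
    weekly_working_hours,     # average working hours per week
    weekly_revenue,           # total revenue per week ($)
    weekly_open_hours,        # hours per week during which revenue is earned
    frac_employees_affected,  # educated guess, 0.0 to 1.0
    frac_revenue_affected,    # educated guess, 0.0 to 1.0
):
    """Rough estimate of the average cost of one hour of downtime."""
    employee_cost_per_hour = weekly_payroll / weekly_working_hours
    revenue_per_hour = weekly_revenue / weekly_open_hours
    return (employee_cost_per_hour * frac_employees_affected
            + revenue_per_hour * frac_revenue_affected)
```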

Caveats

  • Ignores the cost of repair, such as operator overtime or bringing in consultants
  • Ignores daily and seasonal variations in revenue
  • Indirect costs of outages can be as important as these more immediate costs
  • Hence the estimate tends to be on the conservative side

Real Example

Amazon 2001

  • Revenue of $3.1B/year, with 7,744 employees
  • Revenue per hour (24×7): ~$350,000
  • If an outage affects 90% of revenue: ~$320,000 per hour
  • Public quarterly reports do not include salaries and benefits directly
  • Assume an average annual salary of $85,000
  • That gives ~$656M/year, or ~$12.5M/week, for all staff
  • At 50 working hours per week: ~$250,000 per hour
  • If an outage affects 80% of employees: ~$200,000 per hour
  • Total: ~$520,000 per hour
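
Plugging the figures above into the earlier sketch (annual numbers converted to weekly, with the same 90% and 80% guesses) reproduces roughly the same total:

```python
# Amazon circa 2001: approximate public figures as listed above
annual_revenue = 3.1e9           # $3.1B/year, earned around the clock
annual_payroll = 7744 * 85_000   # 7,744 employees at an assumed $85K average salary

cost = downtime_cost_per_hour(
    weekly_payroll=annual_payroll / 52,
    weekly_working_hours=50,     # ~50 working hours per week
    weekly_revenue=annual_revenue / 52,
    weekly_open_hours=24 * 7,    # the online store is always open
    frac_employees_affected=0.80,
    frac_revenue_affected=0.90,
)
print(f"Estimated cost of downtime: ${cost:,.0f} per hour")  # roughly $520,000
```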

*For more examples and a more detailed explanation, see “A Simple Way to Estimate the Cost of Downtime” by Dave Patterson, EECS Department, University of California, Berkeley: http://roc.cs.berkeley.edu/talks/pdf/LISA.pdf
