A quick measure of the costs of running on-premises computers.
As part of applying the AWS Well-Architected Framework to my own infrastructure, I've been digging into costs a bit. I have run home servers for years. A LOT of years. It's basically tied to my identity. Of COURSE I have some Linux servers at home. Geesh, who do you think I am?
Well, a bit dumb, actually. This week I got my new watt meter and took some measurements. The server I used for network monitoring and syslog aggregation drew about 80W. The server I had been using for DHCP and DNS drew about 75W. I replaced the DHCP/DNS box with a small solid-state machine that only uses 15W. I'm also using it for syslog aggregation and sending the logs off to AWS CloudWatch.
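For anyone who wants to check the math, here's the back-of-envelope conversion from a constant draw in watts to monthly kWh. Nothing here is measured beyond the watt-meter readings; the 730 hours is just 24 hours times roughly 30.4 days.

```python
# Back-of-envelope: convert the watt-meter readings to monthly kWh.
HOURS_PER_MONTH = 730  # 24 h x ~30.4 days

def monthly_kwh(watts: float) -> float:
    """Continuous draw in watts -> energy used per month in kWh."""
    return watts * HOURS_PER_MONTH / 1000

old_servers = monthly_kwh(80) + monthly_kwh(75)  # the two old boxes
new_box = monthly_kwh(15)                        # the small solid-state machine

print(f"old: {old_servers:.0f} kWh/month, new: {new_box:.0f} kWh/month")
# old: 113 kWh/month, new: 11 kWh/month
```

That ~113 kWh lines up with the roughly 110 kWh the servers show up as on my bill.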
I'm in California, which means PG&E is my energy provider. I'm on a tiered rate structure: the first 500 kWh cost $0.15/kWh and the next 500 cost $0.20/kWh. Over the last year I've averaged 641 kWh a month, which puts some of my usage into tier 2. The two servers accounted for about 110 kWh of that - a very surprising 17%. And since my usage stays above the 500 kWh tier-1 allowance even without them, every one of those kWh was billed at the tier-2 rate: roughly $22 per month, on average. If I'd only known, I would have dumped them a long time ago. That's stupid. EC2 instances would not have cost me anywhere near that.
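Here's that marginal-cost calculation spelled out, as a quick Python sketch using the rates and usage from above:

```python
# Marginal cost of the servers' ~110 kWh under PG&E's two-tier pricing.
TIER1_LIMIT = 500   # kWh billed at the tier-1 rate
TIER1_RATE = 0.15   # $/kWh
TIER2_RATE = 0.20   # $/kWh

def monthly_bill(kwh: float) -> float:
    """Total monthly energy cost under the two-tier structure."""
    tier1 = min(kwh, TIER1_LIMIT)
    tier2 = max(kwh - TIER1_LIMIT, 0)
    return tier1 * TIER1_RATE + tier2 * TIER2_RATE

with_servers = monthly_bill(641)           # average monthly usage
without_servers = monthly_bill(641 - 110)  # same month minus the servers

print(f"servers cost ${with_servers - without_servers:.2f}/month")
# servers cost $22.00/month
```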
Net net: making these changes isn't just enabling better tooling, it's going to save me money. I don't have cost estimates on the AWS side (CloudWatch at least) yet, but it's hard to see how I'd spend $22 a month there. $4 maybe. Not $22.
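For a rough sense of scale on the CloudWatch side, here's a back-of-envelope sketch. Both the rates (roughly the published us-east-1 CloudWatch Logs prices for ingestion and storage when I looked) and the log volume are assumptions, not measurements, so treat it as a sanity check rather than an estimate.

```python
# Rough sketch of the CloudWatch Logs cost for comparison. The rates are
# approximately the published us-east-1 prices at the time of writing and
# the log volume is a guess; check current AWS pricing and your own volume.
INGEST_RATE = 0.50    # $/GB ingested (assumed)
STORAGE_RATE = 0.03   # $/GB-month retained (assumed)

def cloudwatch_logs_estimate(gb_per_month: float, months_retained: int = 3) -> float:
    """Very rough monthly cost for ingesting and retaining syslog data."""
    ingest = gb_per_month * INGEST_RATE
    storage = gb_per_month * months_retained * STORAGE_RATE
    return ingest + storage

print(f"~${cloudwatch_logs_estimate(5):.2f}/month for 5 GB of syslog")
# ~$2.95/month for 5 GB of syslog
```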
I need to double-check my numbers, but this is surprising. It's dumb to consume what you don't really need, and it's dumb to toss money down the drain. I'm glad I did this exercise. My self-identity as a Linux geek running my own servers at home is not worth $22 per month.
Data. Make decisions from data. Not from emotion. At least not about money!