Never.
Wait, they shut off? Who knew.
I had one Linux server that was up for over 500 days. It would have been up longer but I was organizing some cables and accidentally unplugged it.
Where I worked as a developer we had Sun Solaris servers as desktops to do our dev work on. I would just leave mine on, even during weekends and vacations; it also hosted our backup webserver, so we just let it run 100% of the time. One day the sysadmin said, "You may want to reboot your computer, it's been over 600 days." 😆 I guess he didn't have to reboot after patching all that time, and I didn't have any issues with it.
Prod environments typically don't have downtime, save for quarterly patching that requires a host reboot.
Whenever there is a Proxmox kernel update. Every few years to dust them, or if I get new hardware.
Mine are running all of the time, including during power outages, and are only shut down for physical maintenance and reboot for software maintenance.
This is a little variable, though. Windows hosts tend to require more frequent software reboots in my experience. About once a year I physically open each device, inspect it, clean out dust (fairly rare to find in my setup), and perform upgrades, like replacing old storage devices. Otherwise I leave them alone.
I usually get about 5-7 years out of the servers and 10 out of networking hardware, but sometimes a total failure occurs unexpectedly still and I just deal with it as needed.
Mine chug along 24/7, with only a restart for updates.
They have an off switch? Who knew.
… shutting down?
I have a five 9s SLA with the wife for Plex.
Changes rarely get approved anyway.
She likes to sweat those assets.
Lol. 236 days and 107 days since the last reboots of my two servers.
Out of 6 Cisco servers, 3 have auto power-on at 7 am and auto shutdown at 11 pm. The other 3 are 24/7.
Whenever regular patching necessitates a reboot. Typically once a month.
Once a year for firmware updates. But my Unraid box usually needs a reboot about once a month to stay stable.
Never! I have 2 mini pcs in separate locations running 24/7. One for downloading content, and running a DNS server/dynamic dns. The other for point-to-point VPN to access multiple NVRs that are blocked from the WAN itself. Luckily they both sip little power!
I set up a cron job to reboot once a day. It's for my security cameras and I want to ensure access. But if you don't have issues, you don't need to.
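For anyone who wants the same safety net, a minimal sketch of what that crontab entry could look like (the 04:00 time is an arbitrary choice here, not the commenter's actual schedule; pick a quiet hour for your cameras):

```shell
# Edit root's crontab with: crontab -e
# Fields: minute hour day-of-month month day-of-week command
# This line reboots the host every day at 04:00 local time.
0 4 * * * /sbin/shutdown -r now
```

Using root's crontab (rather than a user's) avoids needing sudo for the shutdown command.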
I mean, so far the longest uptime I’ve seen at my current job is 9 years. Yes, that host should be patched. But given its role, and network access, it’s fine. Running strong. It is in a DC. Server grade hardware is designed with 24/7 operation in mind. Consumer hardware might be fine, but wasn’t designed with 24/7 critical operation in mind.
At home, I have some nucs on 24/7, and a r740 and nx3230 on 24/7. The rest is for true lab env and I only power on as needed.
Even though live kernel patching is a thing, I generally do a full reboot every month or two for the next big patch.
Full shut downs? Are we upgrading them, dusting them, or doing any other maintenance to them? That would be the only case besides UPS failure or power outage.
Uhhhhh, never
What are you guys even running that needs to be on?
I just got a Dell R510 and an HPE ProLiant 360 G7 and installed ESXi on them, but I can't find anything that would justify running them 24/7.
I mean, besides a NAS that holds some files, I can't find anything worthy. I can only think of enterprise purposes, which I don't need at home.
So, to answer the question, they are always off until I want to experiment.
Only when I'm installing/removing hardware. Probably like once a year on average.
Almost never since getting a whole home generator.
Only shut down for maintenance if hardware breaks. Otherwise reboots are done to update firmwares, esxi.
Most things stay up 24/7
I have a couple machines I don't currently use for anything so they're powered off until needed.
If it is a Windows 95 server then every three days. Format and reinstall once every three months.
I have two hosts: a Raspberry Pi that serves as a Pi-hole and as a log of infrequent power outages, which runs 24/7, often with 100+ days of uptime (seeing the "(!)" sign in htop is so satisfying), and an SFF that shuts itself off nightly, provided nothing is happening on it (power is expensive).
My stuff is pretty low powered so it runs 24/7 except one old machine I use as a last resort offline backup that I boot and sync to every few months.
Everything in my lab is up 24/7 unless my UPS shuts it down in a power outage, if I'm doing any work inside the chassis or if I'm updating something. If you can handle the power bill, no real harm keeping it online all day.
Shut down? Never, reboot when necessary.
An old 486 Slackware 4.0 server I had on a big UPS made it through several dorm/apartment moves without a shutdown. Something like 7 years of uptime when I finally retired it.
shutdown? - never :D
My optiplex 9010 sff is what I use for experimenting with services and as a staging area for moving VMs to my main lab because it's air gapped. At max load it runs at 140w but it has a GTX 1650 that I use for gaming as well.
Otherwise the rest of my lab is only turned on when I'm using it, or when I forget to turn it off before I leave the house. When I get a laptop again I'll leave it on more. None of it is more than $150 to replace though. It's a Hyve Zeus, a Cisco ISR 4331, and a Catalyst 3750X, so nothing heavy, just a little loud.
When I'm adding hardware or decide to blow the dust out of my PC equipment (which is way less often than I should). I have dogs and cats and their hair gets everywhere.
Never really shut my mini PCs down; sometimes I restart a Proxmox node if I want it to use an updated kernel, but that's it. I don't run large servers at home.
Usually I reboot once a year, but in reality power outages limit uptime to about this anyway.
It depends. I don't run anything public facing so security updates that need reboots are less of a concern to me
My Windows servers are rebooted once a month for patches. My Linux servers maybe once every couple months for kernel patches or if I screwed something up. My physical proxmox hosts? Twice in the last year. Once because I moved. The other time because I upgraded to proxmox 8.
Power failures, hardware upgrades.
You can turn host machines off? Who knew.
Seriously, mine only get switched off if hardware breaks or needs reconfiguring.
I shut down my NAS after work because I tend not to use its services outside of that yet, and saving about 2/3 of a day of electricity is worth it. The machines that provide services like networking and security run on UPS 24/7, until there is a need to update or a UPS fails.
Only when I swap or upgrade internal hardware.
These run 24/7/365.
You shut down servers? I guess when I clean out dust 🤷‍♂️
My chassis has 7 blades in it, and I typically only keep 4 powered on. However, I patch them regularly, requiring reboots, but I don't have to take any VMs down with DRS.
Today marks the first day in about 2 years that my hosts will be shut down on purpose. Running new electrical circuits for the rack.
Previous shutdowns have been like weather related power outages and such.
You don't (and generally shouldn't) reboot servers. People got the idea that PCs needed to be rebooted because Windows is trash and becomes more unstable the longer it runs. Server OSes don't have this problem.
I have several ESXi hosts, which automatically turn off and on as needed by vCenter based on server loads.
Otherwise, I don’t turn anything else off.
Summer - every day in the afternoon for heat and power usage (time-of-use rates triple from 3-9 pm). Scripted to keep must-have apps running on one host per site.
Winter - once a month for the weekend after patch Tuesday. It’s a chance to check for cables being nibbled/cleaning/other things needing doing.
Ideally I don't.
Mine is small and idles at 17 watts, but I'll shut it down if I don't use it for many days, and also when I'm on vacation.
Sometimes I don’t need all the things running so I’ll kill a few pi’s and disks