Originally Posted by
Mongius
There is a big difference between server farms and closed ATX/ITX systems. Server rooms are full of open racks, several meters high, in a vast hall. They are equipped with one SoC, about 8 HDDs and a low-energy CPU, and consume about 300W of power. I don't know actual numbers but I assume they reach a temperature of ~60°C max. Not much; some GPU fans don't even start spinning at those temperatures. Storage servers don't even use a dedicated GPU. Just get a 24/7 air conditioning system and you're fine.
Now, an ATX system is compact in a closed case and consumes much more power. It needs a direct airflow system to pull in fresh air and extract used air constantly. If you put a fan next to it that just whirls the used air around inside the case, it becomes not impossible, but much harder, for the system to get that used air out, resulting in higher temperatures and a shorter lifespan for your components. It's not a big deal if a few systems die in a server farm, but if your one and only PC dies, it's gonna be a big hassle.
Not sure what farms you work in, but in my experience that's not really true.
Servers come in a variety of form factors. Yes, some are ATX/ITX, but many (if not the majority) are other form factors such as blades, special 1RUs, etc. Some might be SoC, but again, it depends; many aren't. The number of HDDs varies too: some have none, some have upwards of 20. Power usage varies just as much, from under 100W up to 2000W on some larger systems, not to mention blade chassis which might use even more (I've seen some big render farms too; no idea how much power a box with 8 Quadros draws, but probably not a small number).
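For a rough sense of scale, here's a hedged back-of-envelope estimate (the ~250W per-card figure and the ~400W for the rest of the platform are my assumptions, not measured numbers; actual draw depends heavily on the specific cards and workload):

```python
# Hypothetical back-of-envelope estimate for an 8-GPU render box.
# Both figures below are assumptions, not measured values.
QUADRO_TDP_W = 250   # assumed per-card power for a high-end Quadro
NUM_GPUS = 8
PLATFORM_W = 400     # assumed CPUs + RAM + storage + fans

gpu_total = QUADRO_TDP_W * NUM_GPUS   # GPUs alone
box_total = gpu_total + PLATFORM_W    # whole box at full load

print(f"GPUs: {gpu_total} W, whole box: ~{box_total} W")
# → GPUs: 2000 W, whole box: ~2400 W
```

So even with conservative guesses, a box like that plausibly sits in the 2kW-plus range at load, which lines up with the bigger single servers mentioned above.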
In my experience many farms are kept farrrr below 60°C (a certain server once reached 50°C and it was not a good situation). I would say they are closer to 10-20°C; I've always felt the need for a jumper when working in one for any length of time, and a pair of earplugs/muffs too. The vast majority of servers I've worked on have had very high CFM fans; on the little baby 1RUs and blades this means 80mm or smaller fans pumping upwards of 150cfm, and they are very loud. Air conditioning is required 100% of the time, but that's in addition to the servers' own fans. Additionally, on closed systems the aircon is ducted right into the roof of the case, whereas open systems will use hot/cold aisles.
If it's a particularly hot day or there's aircon/server fan trouble, then a big blower is pointed directly into the server. The kinds of blowers I'm talking about are like this:
Typically we would connect the intake to a duct, then point the blower directly into the affected server.
In the farms I worked in, it would be a huge deal if any system failed. All of them were fenced and had failover set up, but any single server might cost well over $100,000. It's not something you can casually lose. Even losing a disk in an array is horrifying, tbh.
In a consumer case you can definitely use a large fan pointed directly into the case to create high static pressure; air will naturally flow out of any available hole, and the situation you describe, where used air just circulates within the case, is much less likely to happen if you are using an outside fan. Right now I'm actually using one HSF with 120/82 push/pull in a Packard Bell case (15-20 years old, iirc), with no exhaust fan or intake fan. There is no static pressure within the case, only on the HSF. A setup like this does not perform as well as having static pressure in the entirety of the case, but it still performs fine.