this post was submitted on 01 Sep 2024
23 points (100.0% liked)
TechTakes
while radiating out waste heat at a higher temp would be easier, it'll also eat up valuable power, and either i don't get something or you're trying to break the laws of thermodynamics
I'm saying that we shouldn't radiate if it would be expensive. It's not easy to force the heat out to the radiators; normally radiation only works because the radiator is more conductive than the rest of the system, and so it tends to pull heat from other components.
We can set up massive convection currents in datacenters on Earth, using air as a fluid. I live in Oregon, where we have a high desert region which enables the following pattern: pull in cold dry air, add water to cool it further and make it more conductive, let it fall into cold rows and rise out of hot rows, condition again to recover water and energy, and exhaust back out to the desert. Apple and Meta have these in Prineville and Google has a campus in The Dalles. If you do the same thing in space, then you end up with a section of looped pipe that has fairly hot convective fluid inside. What to do with it?
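A back-of-the-envelope for the evaporative step described above, as a sketch: the water absorbs its latent heat of vaporization from the air, so the air cools. The flow ratio below is an illustrative assumption, not a figure from any of these facilities.

```python
# Rough energy balance for evaporative cooling of intake air.
# cp and latent heat are standard values; the water/air ratio is
# an assumed example, not a real datacenter figure.

CP_AIR = 1005.0   # J/(kg*K), specific heat of dry air
L_VAP = 2.26e6    # J/kg, latent heat of vaporization of water

def evaporative_temp_drop(water_kg_per_kg_air: float) -> float:
    """Temperature drop of the air stream when this much water
    (per kg of air) evaporates into it; the heat comes out of the air."""
    return water_kg_per_kg_air * L_VAP / CP_AIR

# Evaporating 4 g of water per kg of dry air:
dT = evaporative_temp_drop(0.004)
print(f"air cools by ~{dT:.1f} K")  # ~9 K
```

Dry desert air matters here: the drier the intake, the more water it can absorb before saturating, and the bigger this drop can be.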
I'm merely suggesting that we can reuse that concentrated heat, at reduced efficiency (not breaking thermodynamics), rather than spending extra effort pumping it outside. NASA mentions fluid loops in this catalogue of cooling options for cubesats, and I can explain exactly what I mean with Figure 7.13. Note the blue-green transition from "heat" to "heat exchanger"; that's a temperature differential, and at the sorts of power levels a datacenter runs at, it may well represent a significant amount of recoverable work.
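How much work that differential can yield is bounded by Carnot efficiency. A minimal sketch, with assumed loop temperatures (85 °C hot side, 45 °C at the heat exchanger; these are plausible server-loop numbers, not figures from the thread or from NASA's catalogue):

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Upper bound on the fraction of heat convertible to work
    between reservoirs at t_hot_k and t_cold_k (kelvin)."""
    return 1.0 - t_cold_k / t_hot_k

# Assumed temperatures: ~85 C hot loop, ~45 C heat exchanger.
eta = carnot_efficiency(273.15 + 85, 273.15 + 45)
print(f"Carnot limit: {eta:.1%}")  # ~11%
```

A real heat engine gets only a fraction of this bound, which is why the reply below puts the realistic recovery well under 20%.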
okay so you want to put a bottoming-cycle thermal powerplant on the waste heat? am i getting that right?
so now some of that heat is downgraded to lower-temperature waste heat, which means you need a bigger radiator. you get some extra power, but it'd be a miracle if it's anything over 20%. also you need to carry a big heat engine up there, and the whole time you still have to disperse the same power, because it gets put back into the same server racks. this is all conditional on how cold you can keep the condenser, but it's pointless for a different reason
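The bigger-radiator point follows directly from the Stefan-Boltzmann law: radiated power scales with T⁴, so rejecting the same heat at a lower temperature needs much more area. A sketch with assumed example figures (1 MW load, emissivity 0.9, a deep-space sink near 0 K):

```python
SIGMA = 5.670e-8  # W/(m^2*K^4), Stefan-Boltzmann constant

def radiator_area_m2(power_w: float, t_radiator_k: float,
                     emissivity: float = 0.9) -> float:
    """Ideal radiator area needed to reject power_w to deep space
    (sink temperature approximated as 0 K)."""
    return power_w / (emissivity * SIGMA * t_radiator_k ** 4)

# Rejecting 1 MW at 350 K vs at 300 K (assumed example numbers):
a_hot = radiator_area_m2(1e6, 350)
a_cold = radiator_area_m2(1e6, 300)
print(f"~{a_hot:.0f} m^2 at 350 K vs ~{a_cold:.0f} m^2 at 300 K")
```

Dropping the rejection temperature from 350 K to 300 K grows the radiator by (350/300)⁴ ≈ 1.85×, which is exactly the penalty a bottoming cycle imposes by downgrading the heat.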
you're not limited by input power (that much), you're more limited by launch mass, and per kilogram, more solar panels will get you more power than a heat engine + extra radiators. also this introduces lots of moving parts, because it'd be a stirling engine or something like that. also all that expensive silicon has to run hot, because otherwise you get dogshit efficiency, and that's probably not extra optimal for reliability. also you can probably get away with moving heat around with heat pipes, no moving parts involved
also you lost me there:
> add water to cool it further

okay this works because water evaporates, cooling down air. this is what every cooling tower does
> and make it more conductive

no it doesn't (but it doesn't actually matter)
> condition again to recover water and energy

and here you lost me. i don't think you can recover water from there at all, and i don't understand where the temperature difference comes from. even if there's any, it'd be tiny and the amount of recoverable energy would be purely ornamental. if i get it right, it's just hot wet air being dumped outside, unless somehow the server room runs at temperatures below ambient
also i'm pretty sure that's not how it works at all, where did you get it from
and I’m over here like “what if we just included a peltier element… but bigger” and then the satellite comes out covered in noctua fans and RGB light strips