this post was submitted on 02 Aug 2023
416 points (96.0% liked)
Technology
There is still heat generated by the act of computation itself, unless you use something like reversible computing, but I don't believe there's any practical way to do that today.
And even then, superconducting semiconductors are still going to be some way off. We could have superconductors in power transmission within the next decade and still see virtually no changes to processors. I don't doubt that we will eventually do something close to what you describe, but I'd say it's easily a long way off still. We'll probably only be seeing cheaper versions of things that already use superconductors, like MRI machines.
I appreciate you revising your reply to be less harsh. I wasn't aiming to correct you on anything; I was just offering some thoughts. I find this stuff interesting and like to chat about it. I'm sorry if I made your day worse, I hope things improve.
I said "superconducting semiconductors" just as a hand-wavy way to refer to logic gates/transistors in general. I'm aware that those terms are mutually exclusive, but that's on me; I should have used quotes to indicate it was a loose analogy or something.
The only thing I disagree with is your assessment that computation doesn't create heat; it does, albeit in an entirely negligible amount. Traditional computation involves erasing information, which necessarily causes an increase in entropy, and that increase dissipates heat. It's called Landauer's principle. It's an extremely small proportion compared to resistive loss and the like, but it's there nonetheless. You could pretty much deal with it by just absorbing the heat into a housing or something. We can, of course, design architectures that don't erase information, but I'm reasonably confident we don't have anything ready to go.
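To put a number on "negligible": the Landauer limit is k_B · T · ln 2 of heat per erased bit. A minimal sketch at room temperature (300 K is my assumed temperature):

```python
import math

# Landauer's principle: erasing one bit of information dissipates at
# least k_B * T * ln(2) of heat, regardless of the hardware.
K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

e_bit = K_B * T * math.log(2)  # minimum heat per erased bit, joules
print(f"Landauer limit at {T:.0f} K: {e_bit:.3e} J per bit")
```

That works out to roughly 3 × 10⁻²¹ J per bit, which is why it's invisible next to resistive losses.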
All I really meant to say is that while we can theoretically create superconducting classical computers, a room-temperature superconductor would mostly still be used to replace current superconductors, removing the need for liquid-helium or liquid-nitrogen cooling. Computing will take a long time to sort out; there's a fair bit of ground to make up yet.
I think "rounding error" is probably the closest term I can think of. A quick back-of-the-envelope estimate says erasing 1 byte per cycle at 1 GHz would take ~10 years to warm a small piece of silicon by 1 K. That's hilariously lower than I'm used to these things turning out to be, but I'm normally doing relativistic stuff, so it's not really fair to assume they'll be even remotely similar.
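As a sanity check on the order of magnitude, here's that estimate made explicit. The 10 mg silicon mass is my assumption (the ~10-year figure only comes out for something die-sized and thermally isolated; a full 300 mm wafer would take correspondingly longer):

```python
import math

# Heat from erasing 1 byte per cycle at 1 GHz, at the Landauer limit,
# warming a small piece of silicon by 1 K (adiabatic, no heat leaks).
K_B = 1.380649e-23           # Boltzmann constant, J/K
T = 300.0                    # assumed room temperature, K
e_bit = K_B * T * math.log(2)

power = 8 * 1e9 * e_bit      # 8 bits erased per cycle at 1 GHz, watts
print(f"Dissipated power: {power:.2e} W")

mass = 1e-5                  # kg (~10 mg of silicon; assumed)
c_si = 705.0                 # specific heat of silicon, J/(kg*K)
seconds = mass * c_si * 1.0 / power  # time for a 1 K temperature rise
print(f"Time to warm by 1 K: {seconds / 3.156e7:.1f} years")
```

With those assumptions it lands right around a decade, so "rounding error" undersells it if anything.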