195
submitted 1 year ago by Five@beehaw.org to c/technology@beehaw.org
[-] snor10@lemm.ee 20 points 1 year ago
[-] sci@feddit.nl 72 points 1 year ago

Imagine that you're locked in a room. You don't know any Chinese, but you have a huge instruction book written in English that tells you exactly how to respond to Chinese writing. Someone outside the room slides you a piece of paper with Chinese writing on it. You can't understand it, but you can look up the characters in your book and follow the instructions to write a response.

You slide your response back out to the person waiting outside. From their perspective, it seems like you understand Chinese because you're providing accurate responses, but actually, you don't understand a word. You're just following instructions in the book.
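The room's mechanics amount to pure symbol lookup: input characters in, rule-book match, output characters out, with no comprehension anywhere in the loop. A minimal sketch of that idea (the phrases, rules, and function name here are invented for illustration, not part of Searle's argument):

```python
# A toy "Chinese Room": the occupant follows a rule book (a lookup table)
# and produces sensible-looking replies without understanding any of them.

RULE_BOOK = {
    "你好": "你好！",            # rule: slip says "hello" -> reply "hello!"
    "你会说中文吗？": "会。",     # "do you speak Chinese?" -> "yes."
}

def room_reply(slip: str) -> str:
    """Match the symbols against the book; no comprehension involved."""
    return RULE_BOOK.get(slip, "请再说一遍。")  # fallback: "please repeat that."
```

From the outside, `room_reply` "speaks Chinese"; inside, it is only string matching, which is exactly the intuition the thought experiment trades on.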

[-] tetris11@lemmy.ml 39 points 1 year ago* (last edited 1 year ago)

It's a thought experiment involving a room where people write letters and shove them under the door of a Chinese kid's dorm room. He doesn't understand what's in the letters, so he just forwards the mail randomly to his Russian and Indian neighbours, who react angrily or happily depending on the content. Over time the Chinese kid learns which symbols make the Russian kid happy and which make the Indian kid happy, and forwards the mail accordingly, until he gets a girlfriend who tells him that people really shouldn't be shoving mail under his door, and that he shouldn't be forwarding mail he doesn't understand for free.

[-] maeries@feddit.de 22 points 1 year ago
[-] 100years@beehaw.org 14 points 1 year ago

Wow, solid wiki article! It's very hard to say anything on the subject that hasn't already been said.

I didn't see the simple phrasing:

"What if the human brain is a Chinese Room?"

but that seems to fall under eliminative materialism replies.

Part of the Chinese Room program (both in our heads and in an AI) could be dedicated to creating the experience of consciousness.

Searle has no substantial logical reply to this criticism. He openly takes it on faith that humans have consciousness, which is funny because an AI could say the same thing.

[-] FlowVoid@midwest.social 5 points 1 year ago* (last edited 1 year ago)

The whole point of the Chinese Room is that it doesn't need anything "dedicated to creating the experience of consciousness". It can pass the Turing test perfectly well without such a component. Therefore passing the Turing test - or any similar test based solely on algorithmic output - is not the same as possessing consciousness.

[-] reflex@kbin.social 4 points 1 year ago* (last edited 1 year ago)

en.wikipedia.org/wiki/Chinese_room

Man, I love coming across terms like this.

Chinese Room, Chinese Walls, Dutch Treat, Dutch Uncle, Dutch Oven.

[-] ivanafterall@kbin.social 1 points 1 year ago

Wow! Me, too! What is a Dutch Oven!?

[-] shanghaibebop@beehaw.org 5 points 1 year ago
[-] reflex@kbin.social 1 points 1 year ago* (last edited 1 year ago)

Or a fart in a blanket :)

*Satisfied nod.*

[-] FlowVoid@midwest.social 1 points 1 year ago

A covered pot.

this post was submitted on 29 Jul 2023