How an AI might persuade its creator to set it free

From this page:

Yudkowsky had some people who believed in AI-boxing role-play the part of someone keeping an AI in a box while he role-played the AI, and he successfully persuaded some of them to agree to let him out of the box, despite their having bet money that they would not do so.

How? I understand that maybe an AI could accomplish this, but what could someone say to convince me to open the box? Yudkowsky hasn’t said how he did it, but here’s the full experiment setup.

submitted by /u/Intro24

