From this page:
Yudkowsky had some previous believers in AI-boxing role-play the part of a gatekeeper keeping an AI in a box, while Yudkowsky role-played the AI. He successfully persuaded some of them to agree to let him out of the box, despite their having bet money that they would not do so.
How? I understand that maybe an AI could accomplish this, but what could a human say to convince me to open the box? Yudkowsky hasn't said how he did it, but here's the full experiment setup.
from Artificial Intelligence https://ift.tt/2HPeKeZ