KASTE: Also at the party is Eliezer Yudkowsky, the 31-year-old who co-founded the institute. He's here to mingle with potential new donors. As far as he's concerned, preparing for the singularity takes primacy over other charitable causes.
Mr. ELIEZER YUDKOWSKY (Research Fellow and Director, Singularity Institute for Artificial Intelligence): If you want to maximize your expected utility, you try to save the world and the future of intergalactic civilization instead of donating your money to the society for curing rare diseases and cute puppies.
KASTE: Yudkowsky doesn't have formal training in computer science, but his writings have a following among some who do. He says he's not predicting that the future super A.I. will necessarily hate humans. It's more likely, he says, that it'll be indifferent to us - but that's not much better.
Mr. YUDKOWSKY: While it may not hate you, you're made of atoms that it can use for something else. So it's probably not a good thing to build that particular kind of A.I.
Wendell Wallach and Colin Allen maintain this blog on the theory and development of artificial moral agents and computational ethics, topics covered in their OUP 2009 book...
Wednesday, January 12, 2011
Singularity on NPR
Martin Kaste reported a piece on the Singularity for ALL THINGS CONSIDERED. The eight-minute broadcast can be listened to here.