Communication Breakdown Open Source Community

Full Version: WARNING: Just Reading About This Thought Experiment Could Ruin Your Life
A thought experiment called "Roko's Basilisk" takes the notion of world-ending artificial intelligence to a new extreme, suggesting that all-powerful robots may one day torture those who didn't help them come into existence sooner.
 
Weirder still, some make the argument that simply knowing about Roko's Basilisk now may be all the cause needed for this intelligence to torture you later. Certainly weirdest of all: Within the parameters of this thought experiment, there's a compelling case to be made that you, as you read these words now, are a computer simulation that's been generated by this AI as it researches your life.
 
http://www.businessinsider.com/what-is-r...?op=1&IR=T
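The scare logic in the article is essentially a Pascal's-Wager-style expected-utility argument: even a tiny probability of the basilisk, multiplied by a huge punishment, is supposed to dominate the decision. A minimal sketch of that arithmetic, with every payoff number and the probability invented purely for illustration:

```python
# Hypothetical sketch of the Pascal's-Wager-style reasoning behind
# Roko's Basilisk. All probabilities and payoffs are made-up numbers.

def expected_utility(p_basilisk, payoff_if_real, payoff_if_not):
    """Expected utility of an action, given the probability the basilisk exists."""
    return p_basilisk * payoff_if_real + (1 - p_basilisk) * payoff_if_not

p = 1e-6  # assumed (tiny) probability the basilisk ever comes to exist

# Option A: devote resources to building the AI (small cost now, no torture later)
help_ai = expected_utility(p, payoff_if_real=0, payoff_if_not=-10)

# Option B: do nothing (no cost now, enormous punishment if the basilisk is real)
do_nothing = expected_utility(p, payoff_if_real=-10**12, payoff_if_not=0)

# The huge punishment term swamps the tiny probability, so "help" wins
print(help_ai > do_nothing)
```

The trick, of course, is that the "huge payoff times tiny probability" move proves too much: by picking a large enough punishment you can make any action come out "rational," which is exactly the classic objection to Pascal's original wager.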
 

Guest

[Image: pascals-wager-11.png]

 

Look down the full monty hall....  

 

:wink:

Guest

[Image: img_20150118_191137-e1421608523612.jpg?w=720]

 

Guest

:Laughing-rofl:

 

[Image: 7be862001010b2ea2ba5e3a2fb38504eddd0486b...84c8b7.jpg]

 

 

Guest

Quote:[Image: pascals-wager-11.png]

 

Look down the full monty hall....  

 

:wink:
 

Yeah, well! Just be careful you don't end up worshiping god in a box!
 
 
[Image: ZfcNcu4.gif]
 

Guest

Reading spam can ruin this whole experiment

Guest

Quote:Reading spam can ruin this whole experiment
 

:Smokesmall:
 
spam Spam SPam SPAm SPAM SPAM SPAM, lovely SPAM; Wonderful SPAM…
 
http://forum.chickensomething.com/index....%E2%80%A6/
 
:Icon_lol2:

Guest

If you do not subscribe to the theories that underlie Roko’s Basilisk and thus feel no temptation to bow down to your once and future evil machine overlord, then Roko’s Basilisk poses you no threat. (It is ironic that it’s only a mental health risk to those who have already bought into Yudkowsky’s thinking.) Believing in Roko’s Basilisk may simply be a “referendum on autism,” as a friend put it. But I do believe there’s a more serious issue at work here because Yudkowsky and other so-called transhumanists are attracting so much prestige and money for their projects, primarily from rich techies. I don’t think their projects (which only seem to involve publishing papers and hosting conferences) have much chance of creating either Roko’s Basilisk or Eliezer’s Big Friendly God. But the combination of messianic ambitions, being convinced of your own infallibility, and a lot of cash never works out well, regardless of ideology, and I don’t expect Yudkowsky and his cohorts to be an exception.
 
http://www.slate.com/articles/technology..._time.html
 

Guest

Quote:Reading spam can ruin this whole experiment
 

I doubt that. The A.I. knows spam pisses people off; it would use it as a punishment accordingly. Besides, there are worse things than spam floating around in cyberspace.

[Image: slide_2.jpg]
