Thursday, June 5, 2014

Sci-Fi Author Roger Williams at SpaceTimeMind

One of the projects I've been highly absorbed in lately is the new podcast and video series, SpaceTimeMind, that I'm co-hosting with Richard Brown. There's a lot of overlap in themes between SpaceTimeMind and the Alternate Minds project. See, for instance, our 5th episode, Transhumanism and Existentialism. Especially pertinent is our latest installment, our interview with Roger Williams, author of The Metamorphosis of Prime Intellect (discussed previously here and here).

Monday, January 6, 2014

Searching the Internet for evidence of time travelers

Wibbly-wobbly timey-wimey.
Two physicists from Michigan Technological University, Robert J. Nemiroff and Teresa Wilson, have written an article, Searching the Internet for evidence of time travelers.

Time travel has captured the public imagination for much of the past century, but little has been done to actually search for time travelers. Here, three implementations of Internet searches for time travelers are described, all seeking a prescient mention of information not previously available. The first search covered prescient content placed on the Internet, highlighted by a comprehensive search for specific terms in tweets on Twitter. The second search examined prescient inquiries submitted to a search engine, highlighted by a comprehensive search for specific search terms submitted to a popular astronomy web site. The third search involved a request for a direct Internet communication, either by email or tweet, pre-dating to the time of the inquiry. Given practical verifiability concerns, only time travelers from the future were investigated. No time travelers were discovered. Although these negative results do not disprove time travel, given the great reach of the Internet, this search is perhaps the most comprehensive to date.

(ht: Maureen Eckert)
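The paper's first search — looking for mentions of a term posted before the underlying event became known — amounts to a timestamp filter over an archive. A minimal sketch of that idea (the tiny post archive here is invented for illustration; the paper's actual search terms included "Comet ISON", which was discovered on 21 September 2012, so an earlier mention would be anomalously prescient):

```python
from datetime import datetime

def find_prescient_mentions(posts, term, announcement_date):
    """Return posts that mention `term` with a timestamp earlier than its
    public announcement -- the kind of "prescient content" sought in the
    first search."""
    return [
        (ts, text) for ts, text in posts
        if ts < announcement_date and term.lower() in text.lower()
    ]

# Hypothetical post archive: (timestamp, text) pairs.
posts = [
    (datetime(2012, 1, 5), "Excited about Comet ISON's approach!"),
    (datetime(2013, 1, 5), "Comet ISON will be visible this fall."),
]

# Comet ISON was discovered on 21 September 2012; only the January 2012
# post predates that announcement.
hits = find_prescient_mentions(posts, "Comet ISON", datetime(2012, 9, 21))
print(hits)
```

The hard part in practice, as the paper notes, is verifiability — ruling out backdated or edited content — which is why the authors restricted themselves to travelers from the future.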

Saturday, February 23, 2013

Roko's basilisk - RationalWiki


Roko's basilisk is a proposition suggested by a member of the rationalist community LessWrong, which speculates about the potential behavior of a future godlike artificial intelligence. According to the proposition, it is possible that this ultimate intelligence may punish those who fail to help it, with greater punishment accorded those who knew the importance of the task. This is conventionally comprehensible, but the notable bit of the basilisk and similar constructions is that the AI and the person punished have no causal interaction: the punishment would be of a simulation of the person, which the AI would construct by deduction from first principles. In LessWrong's Timeless Decision Theory (TDT), this is taken to be equivalent to punishment of your own actual self, not just someone else very like you.


In short order, LessWrong posters began complaining that merely reading Roko's words had increased the likelihood that the future AI would punish them — the line of reasoning was so compelling to them that they believed the AI (who would know they'd once read Roko's idea) would now punish them even more for being aware of it and failing to donate all of their income to institutions devoted to the god-AI's development. Thus, even looking at this idea was harmful, lending Roko's proposition the "basilisk" label (after the "basilisk" image from David Langford's science fiction stories, which was in turn named after the legendary serpent-creature from European mythology that killed those who saw it). The more sensitive on LessWrong began to have nightmares.

Thursday, February 21, 2013

Indefinite Survival Through Backup Copies


Anders Sandberg & Stuart Armstrong

abstract: If an individual entity endures a fixed probability of disappearing ("dying") in a given fixed time period, then, as time approaches infinity, the probability of death approaches certainty. One approach to avoid this fate is for individuals to copy themselves into different locations; if the copies each have an independent probability of dying, then the total risk is much reduced. However, to avoid the same ultimate fate, the entity must continue copying itself to continually reduce the risk of death. In this paper, we show that to get a non-zero probability of ultimate survival, it suffices that the number of copies grows logarithmically with time. Accounting for expected copy casualties, the required rate of copying is hence bounded.
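The logarithmic-growth claim is easy to check numerically. If each of n(t) copies independently dies with probability p in period t, the lineage survives period t with probability 1 − pⁿ⁽ᵗ⁾, so ultimate survival is the infinite product of those factors — nonzero exactly when the sum of pⁿ⁽ᵗ⁾ converges, which n(t) = c·log t achieves whenever c·log(1/p) > 1. A minimal sketch (the death probability p = 0.5 and the growth constant c = 3 are illustrative choices, not from the paper):

```python
import math

def survival_probability(p, copies_at, periods):
    """Probability that the lineage survives `periods` time steps, assuming
    each copy independently dies with probability p per step and the
    population is topped up to copies_at(t) at the start of step t.
    The lineage ends only if every copy alive in some step dies."""
    prob = 1.0
    for t in range(1, periods + 1):
        prob *= 1.0 - p ** copies_at(t)
    return prob

p = 0.5  # illustrative per-copy, per-step death probability

# Logarithmic growth: n(t) = ceil(c * ln(t + 1)).  Survival converges to a
# nonzero limit when c * ln(1/p) > 1 (here c = 3 and ln(1/p) ~ 0.69).
log_growth = lambda t: math.ceil(3 * math.log(t + 1))
constant = lambda t: 3  # fixed population, for contrast

print(survival_probability(p, log_growth, 100_000))  # bounded away from 0
print(survival_probability(p, constant, 100_000))    # decays toward 0
```

With a constant population the per-step survival factor is fixed below 1, so the product vanishes; logarithmic growth shrinks the per-step extinction risk fast enough (roughly like t raised to −c·log(1/p)) that the product settles at a positive value.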

