[cw: (fairly strong) apocalypse, (fairly strong) death, (arguably) drugs, (arguably) discourse]
I woke up with a response to https://sigmaleph.tumblr.com/post/674125956243439616/i-feel-like-people-who-are-very-concerned-about-ai forming in my head, but when I went back and read the post again I found that it rapidly diverges from the part I wanted to respond to, so a reblog no longer seemed like the right place for it.
---
I'm not sure how much I endorse this, but hard drugs feel like an insufficiently out-there response to an out-there problem.
Consider a toy model where timelines fall into two categories:
1. Terran civilisation continues ticking along through the 2020s more or less intact. In 2030, it invents grey goo and annihilates itself.
2. A Carrington-class solar flare hits Earth during the 2020s, destroying the electrical grid. Many people die, but not *everyone*. In 2030, Earth is in no condition to invent grey goo and annihilate itself, and the survivors continue surviving.
Then, reasoning anthropically, *all versions of you that experience 2031 will have experienced a Carrington-class solar flare*, and you should treat a 2020s Carrington event as if it were certain. (The versions of you who get it wrong will be too dead to care.)
The real probability-space is more complicated, of course, but I think the idea holds: if most timelines result in your unavoidable destruction, assume you're in one of the timelines where your destruction is avoidable and plan around that. The yous who *do* turn out to be in avoidable-destruction timelines will thank you.
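The toy model's conditioning step can be sketched numerically. This is just an illustration of the anthropic argument above, not anything from the original post: `p_flare` and `survival_rate_flare` are made-up example numbers, and the function name is mine. The point is that Bayes' rule gives a posterior of 1 for the flare timeline, no matter how small the prior, because the grey-goo timeline contributes zero survivors.

```python
def p_flare_given_survival(p_flare: float, survival_rate_flare: float) -> float:
    """P(Carrington timeline | you are alive in 2031), by Bayes' rule.

    Category 1 (grey goo) leaves no survivors, so every surviving
    observer is in a flare timeline and the posterior is 1.0 for
    any prior p_flare > 0.
    """
    p_survive_via_flare = p_flare * survival_rate_flare  # flare timelines: some survive
    p_survive_via_goo = (1 - p_flare) * 0.0              # goo timelines: nobody survives
    return p_survive_via_flare / (p_survive_via_flare + p_survive_via_goo)

# Even a 1% prior on a Carrington event becomes certainty after
# conditioning on being alive in 2031:
print(p_flare_given_survival(0.01, 0.5))  # -> 1.0
```

The "real probability-space is more complicated" caveat corresponds to the goo term not being exactly zero: give the grey-goo timeline even a tiny survival rate and the posterior drops below 1, which is where the toy model stops being a proof and becomes a heuristic.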