--- Log opened Mon Oct 25 00:00:13 2021
00:01 < maaku> Dune was surprisingly good
00:05 < maaku> lsneff: it won't go mainstream. too many people fundamentally don't want it
00:05 < maaku> first of all people actually want to die (wtf)
00:06 < maaku> second, brain emulation offends the beliefs of a majority of people
00:07 < maaku> and if that weren't a problem, it's a small subset of fringe people who would see an emulation of themselves as a continuation and not some frankensteinian contraption
00:33 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has joined #hplusroadmap
01:20 -!- dustinm [~dustinm@static.38.6.217.95.clients.your-server.de] has quit [Quit: Leaving]
01:32 -!- dustinm [~dustinm@static.38.6.217.95.clients.your-server.de] has joined #hplusroadmap
05:14 < lsneff> I think a lot of people felt similarly about longevity research, though perhaps to a lesser degree
05:18 < lsneff> I personally don’t agree with your last point, but I do see why most people intuitively believe it
05:24 -!- yashgaroth [~ffffffff@2601:5c4:c780:6aa0::f2f0] has joined #hplusroadmap
06:46 < kanzure> they are already frankensteinian contraptions
06:52 < lsneff> speak for yourself
07:09 < L29Ah> So he pulls an alternating-current taser on me and tells me that only the Official Serbian Church of Tesla can save my polyphase intrinsic electric field, known to non-engineers as "the soul."
07:33 -!- yashgaroth [~ffffffff@2601:5c4:c780:6aa0::f2f0] has quit [Quit: Leaving]
07:48 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has quit [Quit: bye bye]
07:49 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has joined #hplusroadmap
07:58 < juri_> yeah, i'm a watcher of sci-fi, and i HATE the deathist propaganda in star trek: picard.
07:59 < juri_> disgusting, to me.
08:14 -!- balrog [znc@user/balrog] has quit [Quit: Bye]
08:19 -!- balrog [znc@user/balrog] has joined #hplusroadmap
08:41 < superkuh> Yep. It was pretty sad to see "death is natural, death is good."
in star trek.
08:48 < rndhouse> juri: How does Picard relate to deathist propaganda?
09:23 < rndhouse> ah ok
10:26 < maaku> lsneff: try to research this stuff from the perspective of someone who knows nothing about it. the first thing you find is long-winded explanations of why brain emulation / uploading works as longevity
10:26 < maaku> e.g. "wait but why" articles or existential comics
10:27 < maaku> that shows that most people exposed to these ideas for the first time really don't find the uploading argument intuitive
10:28 < maaku> (I happen to think they're right to think it's bogus, but that's beside the point here. You asked "how does it become mainstream?" The answer has to include overcoming this hurdle, somehow.)
11:37 < lsneff> You’re registered for cryopreservation, right? How do you hold those two views?
11:44 -!- spaceangel [~spaceange@ip-89-176-181-220.net.upcbroadband.cz] has joined #hplusroadmap
11:45 < docl> anticipating the possibility of direct repair
11:48 < docl> idk once you get to hard materialism and desiring immortality, it seems like more of a 50/50 split on uploading being ok vs direct physical continuity being needed. not exactly a small fringe on either side.
11:49 < docl> personally I feel differently on different days. uploading being ok seems more rational, but doesn't mean I'm fully indifferent to stepping into a copy/destruct transporter
11:50 < docl> the main problem is uploading seeming more feasible...
at least, relative to the current options for cryonics (high estimated damage or aldehyde, no way to get the best of both worlds yet)
11:52 < docl> but there's some wriggle room in that high damage is merely an estimate (in an optimal case), as the particular circumstance is that with good vitrification you have a lot of dehydration and it creates conditions bad for EM imaging, whereas aldehyde is basically what they were doing for EM anyway but with a few more steps
11:53 < docl> also there's the fact that cryonics is a moving target depending how much research gets done in your lifetime. since there are no clinical trials it's potentially possible to get the state of the art
12:35 < lsneff> How does one define cryopreservation + complete reconstitution with nanobots as physical continuity? It seems to me like you either think physical continuity is required—in which case you think both brain emulation and cryonics won't work—or you don't think physical continuity is required—in which case you think either is possible.
12:37 < lsneff> If nanobots have to rebuild most neurons in the connectome from tattered lipid bilayers, I wouldn't really define that as physical continuity except if you mean the same atoms have to be used, which has no scientific basis.
12:45 < docl> depends on how tattered the lipid bilayers are. patching a few holes would be different from reconstructing the whole thing atom by atom
12:46 < docl> vitrification doesn't lyse the cells that badly... my understanding is it's mainly that the cytoskeleton tends to collapse when you pull that much water out, and the high concentrations needed to vitrify result in some protein denaturation
13:06 < lsneff> I certainly don't think revival from cryopreservation is impossible, I just don't see why it'd be necessary.
13:07 < lsneff> I have yet to see a convincing argument why physical continuity as defined to include cryopreservation is necessary for successful revival of consciousness.
13:43 -!- xaete[m] [~xaetematr@2001:470:69fc:105::a438] has joined #hplusroadmap
13:50 < docl> I feel the same, but I can't necessarily convince anyone that something like captain kirk's transporter is safe to use based on empirical evidence alone. they might step through it a thousand times (whether to commit suicide, self sacrifice, under coercion, etc) and maintain that they become truly a different person each time despite every observable sign and subjective experience pointing to the
13:50 < docl> contrary. the question is whether one accepts identity as basically information pattern based or not. and accepting that implies things we don't ordinarily experience, like that two future branches of you might exist side by side.
13:59 * fenn gets out his identity bingo card
14:10 < docl> "Everyone who had serious philosophical conundra on that subject just, you know, died, a generation before." -- Cory Doctorow, Down and Out in the Magic Kingdom
14:11 < docl> Of course, that probably won't happen. People will still have kids, and we might achieve biological life extension... Upload-hesitancy could still be a thing until the heat death of the universe
14:12 < fenn> some people avoid blood transfusions and contraception
14:13 < fenn> seat belts, helmets, condoms, masks, steel toed boots
14:13 < fenn> if a form of personal protection has been invented, you can find someone objecting to it
14:14 < fenn> worth noting that in doctorow's story they had regular backups, and dying wasn't taken lightly
14:14 < docl> also there's a power dynamic...
if you want to be part of the first generation of superpowered machine beings it's hard to see mere cyborgs being as powerful as people who upload (depending how cheap computational power gets)
14:15 < docl> if you need a moon sized computer to simulate a human, maybe there's a case for having at least 1 meat brain in there
14:17 < fenn> at the very least they are quite energy efficient
15:09 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has quit [Ping timeout: 265 seconds]
15:15 -!- spaceangel [~spaceange@ip-89-176-181-220.net.upcbroadband.cz] has quit [Remote host closed the connection]
16:17 < lsneff> Anyhow, I think there are the right people working towards brain emulation. I know Adam Marblestone is working in that direction. Regulated preservation, in the way that Kenneth Hayworth would like at least, doesn't seem to be anywhere near, however.
16:21 < lsneff> It doesn't seem like our models of neurons, particularly how neurons and synapses change over time from what I can tell, are good enough yet though
16:49 < maaku> lsneff: you jumped from whole brain emulation to cryopreservation. that's a bit of a non-sequitur
16:50 < maaku> which is the crux of the issue. to my mind they do completely unrelated things
16:50 < lsneff> That's fair. I see them as intrinsically related, but I respect that you don't.
16:50 < maaku> but cryopreservation is always a mere plan B backup. the real goal is to achieve longevity escape velocity and not need it at all except in emergency circumstances
16:58 < maaku> also identity-is-the-computation-process-not-information doesn't mean giving up the same post-human upload future
16:59 < maaku> it just means you achieve it through ship of theseus incremental upgrades rather than a destructive copy-and-kill
17:16 < lsneff> I see what you mean. Have you heard of DHCA?
(https://en.wikipedia.org/wiki/Deep_hypothermic_circulatory_arrest)
17:17 < lsneff> Surgical technique that first induces complete electrocerebral silence
17:18 < lsneff> The vast majority of people recover fully. It seems to imply that consciousness is not contingent on continual brain activity / computational activity.
17:46 < maaku> No it does not imply that.
17:47 < maaku> If you ask a copy "are you the copy or the original?" and it replies "the original" that does not make it so.
17:49 < lsneff> So, you think these people are no longer the same individual as before?
17:51 < maaku> I don't know. We don't have a comprehensive physical theory of this property which I'm concerned about.
17:52 < maaku> I'm 99% sure that the ST teleporter (if it existed) is a murder machine. But stuff like DHCA is very much a gray area.
17:52 < maaku> And cryonics for that matter.
17:54 < maaku> So why am I signed up for alcor instead of nectome? Because alcor's method possibly preserves the continuity of experience, whereas nectome's is 100% certain death.
17:54 < maaku> But the far better plan is to not die in either case.
17:56 < maaku> Do you understand how the statement "The vast majority of people recover fully" has absolutely nothing to do with this concern?
17:56 < maaku> (I'm not really clear what perspective you are coming from on this.)
18:01 < docl> so 99% for a ST transporter and maybe some other number for DHCA, like 10% or 90%?
18:02 < lsneff> My perspective is that there is no copy problem. If I went through a destructive scan process, the resulting emulation, if accurate enough, would be close enough to me to be considered me. If the scan wasn't destructive, we'd both have a common history and would both be "me".
18:05 < lsneff> I don't even consider continuity of experience to be important and I don't see how cryopreservation and revival counts as continuity of experience.
18:07 < lsneff> Could you explain to me why "The vast majority of people recover fully" is unrelated to your concern? The people consider themselves to be the same individual, is that not what's important here?
18:08 < maaku> docl: sure so long as it's understood those probabilities are statements about my/our ignorance. I think if we had a physical, verifiable theory of consciousness then those cases would be clear cut
18:09 < docl> well DHCA recoverees reporting being the same person are analogous to the star trek transportees saying it was them who stepped into the transporters -- except for the part about all the particles being swapped out, of course
18:10 < maaku> lsneff: Let's explore it from a different angle. Suppose you came to your psychiatrist with an issue and he suggested a frontal lobotomy to fix it. He says the vast majority of post-lobotomy patients self-report being happy. Does this sell you on the procedure?
18:11 < docl> maaku: makes sense
18:16 < maaku> lsneff: Alternatively, let's say that magical fMRI improvements make real-time scanning possible, and combined with compact WBE and some other magical body construction technology a company comes up with a "transporter" for rapid trips to Mars. It's a commercial success.
18:17 < maaku> People walk in, "wake up" on Mars, spend a week long holiday, then come home via the reverse process.
18:17 < lsneff> That's analogous, but I don't think it's analogous enough. One can look at the history of lobotomy or even perhaps the other patients in question and see that they may have lost aspects of their personality, lost memories, become unable to think as well, died, etc.
18:18 < lsneff> At the same time, people take anti-depression pills that can sometimes dramatically change their personality, but they're considered the same individual
18:19 < docl> IMO there isn't enough room in the unknown regions to account for consciousness...
it's so entwined with stuff we do understand like neurons that it would be hard for something like a soul (even a mortal one that dies with the neurons) to exist if we aren't actually talking about the patterns of the neurons. and I agree with lsneff that actual alterations to the brain and therefore the pattern is
18:19 < maaku> It's not meant to be analogous. It's meant to demonstrate that asking the patient after the fact is not a reliable means to discover what the patient prior to the experiment would want.
18:19 < lsneff> No, it certainly wouldn't sell me on the procedure because there's a lot of outside context that a lobotomy is a destructive process.
18:19 < docl> disanalogous in an important way
18:19 < lsneff> maaku: Ah, I see. Yes, I do agree with you there.
18:19 < maaku> docl: I'm not talking about new physics, but rather a mapping of consciousness to existing physics such that we understand the mechanisms of how it arises, and where it breaks
18:20 < maaku> see e.g. Max Tegmark's recent work
18:20 < maaku> lsneff: so to the Mars-transporter thing which is supposed to be analogous...
18:20 < maaku> Because it is passive scanning, to achieve the no-clone safety they have a trap door that catches your original body and crushes it into fertilizer.
18:21 < docl> well wireheading or something like a crack cocaine addiction might also be illustrative. happiness or satisfaction with the procedure isn't all we're looking for. you could imagine replacing the brain with a copy taken from a happy, normal, completely different person, and we'd agree it's a different person then
18:21 < maaku> Only the day you walk through, the trap door doesn't work. Someone comes to collect you after the procedure and takes you to the back office. He shows video of you on Mars and explains the mixup.
18:22 < maaku> He then says "we have a backup procedure for cases like this", then reaches into his desk drawer and pulls out a gun...
18:22 < maaku> lsneff: would you want to go along with this?
18:22 < maaku> he assures you it would be quick and painless, and the "you" on Mars would have no memory.
18:23 < lsneff> Obviously not. I would have diverged enough at that point to be a separate individual.
18:23 < docl> maaku: I think he addressed that earlier -- this procedure creates two branch copies that both share the same past self. so going in you have a 50% chance of being the guy who gets turned into fertilizer
18:23 < lsneff> If they could somehow merge our memories together, I would go along with that.
18:24 < maaku> Ok I at least understand the point of disagreement now.
18:24 < lsneff> Since they normally turn you into fertilizer instantly, there's normally no divergence.
18:24 < lsneff> Right
18:25 < maaku> From my own background in physics, and the work done by others like Tegmark, it is my belief that all cases of destructive uploading are like this scenario; making it more instantaneous or seamless doesn't change that fact
18:26 < maaku> In normal operation you'd be the crushed-into-fertilizer version 100% of the time
18:26 < docl> the guy on mars is just as firmly causally linked to the scanned self as the guy getting turned into fertilizer though, right?
18:27 < maaku> docl: causal linkage doesn't matter
18:27 < maaku> if you're a pregnant woman, should you expect a 50% chance of ending up as the baby?
18:27 < maaku> or a man at the point of conception, for that matter
18:28 < docl> I feel it's a necessary but not sufficient criterion
18:28 < maaku> True in a trivial sense
18:28 < maaku> (Good luck doing anything outside of your light cone)
18:29 < lsneff> maaku: If your memories of the last year were suddenly wiped, would you consider yourself the same person?
18:29 < docl> if I get in a rocket and go to mars, there's a causal link to the body that was at home, as well as a preserved information set and functionality...
all these things seem to be what adds up to being the same person in a meaningful sense
18:30 < lsneff> or say, something less drastic: the last month
18:30 < fenn> should i have gotten out my existential crisis bingo card instead
18:31 < docl> the difference is how the information/functionality gets transmitted; the mechanism is radically different with scanning/transmission/emulation. but then the question is how much to care about the mechanism, which seems kind of subjective
18:31 < maaku> lsneff: I don't think the contents of memory have any bearing, so yes. I consider amnesiac people to be the same person for the purposes of this discussion.
18:31 < maaku> (brb in ~20 minutes)
18:31 < lsneff> Ok.
18:33 < lsneff> Say you were digitized in the ship-of-theseus manner, but through an error, your memories during the scanning process (assume it's a month long for the purpose of this discussion) were lost. You only remember being biological and then suddenly being digital.
18:33 < fenn> goes to kitchen to get mayonnaise, wonders what it was he came downstairs for, has existential crisis
18:39 < lsneff> What's the difference between this process and a kill-and-scan process functionally?
18:41 < docl> well there are some physical properties like inertia where if you replace the brain bit by bit you can identify it as a particular object along a smooth timeline... I'm just not sure that matters much
18:42 < lsneff> A brain is not really a single object. It's more like 10^25 different elementary particles.
18:43 < lsneff> You can consider it a single object for convenience, but it doesn't even really have a smooth timeline or a single reference frame.
18:45 < docl> you can throw a brain across the room and it bounces off the wall (not advised, of course)
18:46 < lsneff> It'd probably splash into a bunch of different splotches. Brains are pretty soft
18:47 < docl> yeah but there are molecular bonds keeping it in one piece.
that those can be broken with enough force isn't really the issue
18:50 < lsneff> Fair enough, I was being overly pedantic.
19:27 < maaku> lsneff: memory and personality have no consequence to what docl and I are getting at
19:27 < maaku> you take a roofie and you won't remember a night, but you're still you
19:27 < maaku> you get drunk and your personality changes, but you're still you
19:28 < maaku> In the sense that I mean, the closest description I've found is "expectation of experience"
19:28 < maaku> Meaning what I expect to experience into the future
19:28 < maaku> This isn't really a definition. We're still dancing around that because we don't have a good physical theory of this. But it points in the right direction.
19:29 < lsneff> I see.
19:29 < lsneff> You mentioned you've been following max tegmark's views on this. What does he say about it?
19:32 < lsneff> Could you expand on "memory and personality have no consequence to what docl and I are getting at" as well? Saying that memory and personality are not relevant to deciding what counts as the same individual is getting pretty close to a "soul."
20:26 < maaku> lsneff: the key question is "after I go down for [cryonics/uploading/transporter] do I expect to 'wake up' on the other side?"
20:26 < maaku> in the same sense that I, in the morning, expect to experience eating dinner in the evening that same day
20:27 < maaku> or, do my experiences stop and I go into oblivion or whatever happens at physical death and some *other* entity with my memories is created?
20:28 < maaku> Maybe this pattern matches to "soul". I don't care. I am asking a physics-based question about computational processes.
20:30 < maaku> If I am lobotomized, I expect to experience life post operation in the same way I would if I went in for knee surgery. I'd have a very different personality for sure but that's not the relevant criterion here.
20:31 < lsneff> Can you explain why you think an entity with the same memories/personality who responds the same way to input as the original body and thinks the same things is not just you?
20:31 < maaku> Linking conscious identity (as I am describing it here) to information identity is objectively false. It leads to inconsistent results. So there is a deeper issue about conscious identity that is not well understood yet.
20:32 < lsneff> What inconsistent results?
20:32 < lsneff> I agree that conscious identity is not well understood.
20:33 < maaku> lsneff: do some thought experiments on that. You do a destructive mind upload, so now there's a digital version of you. All okay so far. Now launch another instance.
20:33 < lsneff> Yes?
20:34 < maaku> Going into the uploading machine, which entity do you expect to end up as?
20:35 < lsneff> Initially, they'd be identical. They'd diverge from each other after experiencing different things and would be considered different individuals after some point
20:36 < maaku> never mind, you're off in non-sequitur land again
20:36 < lsneff> I'd consider them to both be me.
20:36 < lsneff> The thing is, I don't think your question is well-formed.
20:36 < lsneff> I don't expect to be one or the other.
20:36 < maaku> so you expect to be dead?
20:37 < lsneff> I'd expect to wake up and one copy of me would be one fork and another copy of me would be the other fork.
20:38 < lsneff> Sure, I'd die in the same way that I go to sleep at night without expecting continuous consciousness through the night.
20:38 < maaku> you expect to die every night?
20:39 < lsneff> they call sleep the little death after all
20:39 < maaku> that's poetic and false
20:39 < maaku> your brain doesn't stop working when you're asleep
20:39 < maaku> don't confuse not forming memories with not being alive at all
20:39 < lsneff> That's certainly true. I'm not saying this well.
Let me try to rephrase
20:48 < lsneff> I get destructively scanned and wake up after being uploaded. I can see another copy of my virtual body wake up on the other side of the room. I feel like lsneff. I talk to the other entity and it also says it feels like lsneff. Both entities lay equal claim to being lsneff. I can see why this would certainly be a confusing situation for anyone involved, but I don't see any logical inconsistencies following a physicalist understanding of the
20:48 < lsneff> brain.
20:49 < lsneff> I think this whole conversation is contingent on one thing: I think that an entity with the same memories and personality counts as me and you don't.
20:54 < maaku> In the non-destructive scan scenario, the way you phrased it you expect "another copy" to wake up in the room and that "other entity" to think it's you, have your memories, etc.
20:54 < maaku> why did you phrase it that way?
20:54 < maaku> You didn't expect to be that other copy talking to the non-destructively scanned you.
20:56 < lsneff> I was phrasing it by following one branch of personal identity. If you went down the other branch, you’d phrase it from the perspective of the other entity.
20:57 < lsneff> I was assuming that both entities that woke up were uploaded.
21:01 < maaku> And if one wasn't uploaded?
21:02 < maaku> I mean the non-destructively scanned case, where there is just a single upload, and the "you" that went into the scanning machine
21:04 < lsneff> Then same thing. I’d wake up, look over, and see the original lsneff looking at me. And, I’d wake up, look over, and see the virtual or robot lsneff looking over at me.
21:04 < lsneff> I’m not both, more that both are me.
21:05 < lsneff> Or that both were what I am right now.
21:09 -!- Hooloovoo is now known as Hoolooboo
21:36 < maaku> lsneff: sorry, we're going in circles. I don't know how to get this point across.
21:37 < lsneff> Same, haha
21:44 < lsneff> This must be worse for you since I imagine you must think I’m completely insane whereas I just think you’re probably wrong.
21:48 < maaku> No, I just think you haven't thought through all the consequences of this.
21:48 < maaku> And seem to be hung up on this having anything to do at all with memory or personality or other such things.
21:50 < maaku> Whereas it's really about Descartes' cogito: "I think, therefore I am," and the effect physical processes have on that thinker continuing to exist or not.
21:51 < maaku> I wish there were better words in English (or any language) for describing this.
23:08 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has joined #hplusroadmap
--- Log closed Tue Oct 26 00:00:14 2021