--- Log opened Sat Jun 20 00:00:55 2020
00:02 -!- darsie [~kvirc@84-114-73-160.cable.dynamic.surfer.at] has joined ##hplusroadmap
00:21 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
00:26 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has quit [Ping timeout: 265 seconds]
00:37 -!- justanotheruser [~justanoth@unaffiliated/justanotheruser] has quit [Ping timeout: 244 seconds]
00:39 -!- justanotheruser [~justanoth@unaffiliated/justanotheruser] has joined ##hplusroadmap
00:44 -!- ffranr [~ffranr@62-64-228-232.dynamic.dial.as9105.com] has joined ##hplusroadmap
01:08 < nmz787> kanzure: I'd be up for seeing a human-gene-modified monkey be born
01:10 < nmz787> people (and their govts) somehow think "catch and release" fishing is acceptable, but not humankeys (or are they hunkeys? humkeys? humonkeys? honkeys? monkans? etc...)
01:14 < L29Ah> https://www.youtube.com/watch?v=JUuu_O9ciPk
01:35 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has joined ##hplusroadmap
01:53 < Urchin[emacs]> supposedly there was a case of human-chimpanzee interbreeding, but the baby was euthanized
01:58 -!- Cory [~Cory@unaffiliated/cory] has quit [Ping timeout: 260 seconds]
02:06 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
02:11 < archels> kanzure: any thoughts on remapping the delete key between right ctrl and right alt?
02:11 < archels> I've been running this way for about a week now, not sure it's a keeper
02:14 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has joined ##hplusroadmap
02:51 < fenn> L29Ah: 7 iq points is a lot for a population average
03:10 < L29Ah> nmz787: releasing humonkeys is clearly evil as it will deprive them of the benefits of hassle-free living in captivity
03:10 < L29Ah> like, say, not exporting democracy to iraq is cruel to iraqi peeps
03:11 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
03:12 < L29Ah> otoh if you can speak and understand basic market economy, you can earn yourself a living being a walking wonder (or a janitor at least)
03:14 < fenn> those monkeys ought to be grateful they have a job~
03:15 < L29Ah> i understand those who decide to kill the humonkeys though
03:15 < fenn> pokemon got it right with mankey
03:16 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has quit [Ping timeout: 258 seconds]
03:16 < L29Ah> society will then have to deal with accepting/rejecting the basic rights of such an entity, and the question is complex in a lot of cultures
03:16 < L29Ah> and everything will get blamed on the scientists that let it happen
03:17 < L29Ah> as the decision won't be unanimous, like with other kinds of slavery or nationalism
03:18 < fenn> the mankeys will blame the people actually, you know, killing them
03:18 < fenn> the whole "woe is unto me, why was i ever created" thing is pretty rare
03:19 < L29Ah> nah, getting away with killing people is hard when you are an outcast
03:20 < L29Ah> if mankeys can make a profit, they won't be killed but enslaved and deprived of culture, like other animals
03:21 < fenn> right
03:21 < fenn> and we have no coherent ethical framework with which to come to a consistent set of laws and norms
03:22 < fenn> mrdata thinks humans that are born insensitive to pain shouldn't be considered persons
03:23 < fenn> clearly they would disagree
03:23 < fenn> and the mankey opinion should count for something as well
03:26 < mrdata> how many of them exist
03:26 < mrdata> the human design confers the same privileges, so fenn's report of my belief is wrong
03:27 < mrdata> we were discussing AI
03:27 < fenn> "Congenital insensitivity to pain is found in Vittangi, a village in Kiruna Municipality in northern Sweden, where nearly 40 cases have been reported."
03:28 < mrdata> 40 cases of what population
03:28 < fenn> ~800
03:28 < mrdata> theyre still human and they get all the same human rights; our previous discussion was not singling out individuals of the same species
03:29 < fenn> why?
03:29 < mrdata> it's a philosophical matter of connectedness to the universe
03:30 < fenn> i have no idea what that is supposed to mean
03:30 < mrdata> those who disconnect are prone to commit monstrous acts of cruelty; a machine doesnt have agency
03:30 < mrdata> you have to connect to the universe adequately to gain agency, in my opinion
03:31 < mrdata> otherwise it's just a mechanism
03:31 < fenn> humans are made out of matter
03:31 < mrdata> the philosophical relationship with the universe at large is the point
03:32 < fenn> is a person in a very convincing VR simulation still a person?
03:33 < fenn> let's say Alice can leave the simulation, but Bob cannot, and otherwise they're functionally identical
03:34 < mrdata> a disembodied AI is disconnected, except to the sim which is its world
03:34 < mrdata> agency within its world might make sense
03:34 < mrdata> a VR sim is a kind of heaven or hell
03:35 < mrdata> can the dead own property, conduct business, rewrite their wills?
03:36 < fenn> can and should are two different things
03:36 < fenn> "we have laws against that sort of thing"
03:37 < fenn> still doesn't resolve the moral issue
03:37 < mrdata> the disconnect is coded into the culture, 'you cant take it with you'
03:38 < mrdata> will you let them commit suicide
03:40 < mrdata> or is this a remnant of a dead loved one, to comfort the living
03:40 < fenn> people routinely set up trusts to continue enacting their agency long after they're gone. for example the nobel prize, carnegie library, and other philanthropic organizations. the degree to which the deceased is able to minutely specify what they want done and how is more an artifact of the legal system than a statement of morality
03:41 < fenn> a sufficiently detailed set of specifications is indistinguishable from a human in a box
03:42 < mrdata> the trust is a mechanism
03:42 < mrdata> it doesnt rewrite its own will
03:42 < fenn> only because the legal system is such an awful programming language
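[fenn's "trust as a program" analogy and mrdata's "mechanism that doesn't rewrite its own will" map cleanly onto code. A minimal Python sketch of the analogy, with entirely hypothetical names and provisions, not any real legal construct:

    # A trust as a mechanism: it disburses according to provisions fixed
    # at creation time and, as mrdata notes, has no code path by which it
    # could rewrite its own will. Purely illustrative.
    class Trust:
        def __init__(self, provisions):
            # the settlor's instructions, frozen when the trust is created
            self._provisions = tuple(provisions)

        def execute(self, estate):
            """Pay out fixed shares of the estate; no self-modification."""
            for beneficiary, share in self._provisions:
                yield beneficiary, estate * share

    # e.g. a Nobel-style endowment split evenly across five prizes
    trust = Trust([("physics", 0.2), ("chemistry", 0.2), ("medicine", 0.2),
                   ("literature", 0.2), ("peace", 0.2)])
    for prize, amount in trust.execute(estate=1_000_000):
        print(prize, amount)

The granularity of `provisions` is the point of fenn's complaint: how finely the deceased can specify behavior is bounded by the expressiveness of the legal system, not by morality.]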
03:43 < mrdata> now, if you put a disembodied AI in charge of a corporation, you have created a monster
03:43 < mrdata> i believe this has already happened
03:44 < fenn> a corporation is already an AI, it's just running on a substrate of binders and minimum wage employees
03:44 < fenn> and there are evil humans as well
03:45 < fenn> all 20th century atrocities were deliberate
03:45 < fenn> and so on back into prehistory
03:46 < fenn> it's an amazing article of faith to believe that an AI would be worse
03:47 < fenn> anyway, better/worse good/evil seems to be missing the point
03:47 < fenn> i think snails have agency, but shouldn't have voting rights
03:48 < fenn> clearly there's some room for finer gradations
03:48 < fenn> at the same time, it's hard to build systems that aren't susceptible to bias
03:49 < fenn> traditionally we've gotten around this by implementing generic principles, in the hopes that these principles are less affected by biases
03:50 < fenn> but if you want to have an ethical framework that covers several orders of magnitude in any dimension, it's got to have finer granularity
03:51 < mrdata> your boss is a psychopath, sure
03:54 < fenn> "courts have long recognized that allowing owners to attach long-lasting contingencies to their property harms the ability of future generations to freely buy and sell the property, since few people would be willing to buy property that had unresolved issues regarding its ownership hanging over it"
03:54 < fenn> this argument doesn't apply to pure monetary currency
03:56 < mrdata> once you can restore senses to those with acquired brain injury, i will consider allowing mechanisms to be deemed 'embodied'. in my view it's still about connectedness
03:56 < mrdata> pain warns you that you are damaging yourself
03:57 < mrdata> i believe knowledge of pain is fundamental to agency
03:58 < mrdata> if you cannot take care of your own safety, are you not put in a home?
04:02 < L29Ah> i hereby declare mrdata's philosophy bullshit, thanks for your attention
04:11 < mrdata> restore senses to those with acquired brain injury
04:11 < mrdata> or you are monsters
04:12 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
04:16 < juri_> History judges everyone to be monsters. I prefer to be one of the more enlightened monsters.
04:17 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has quit [Ping timeout: 260 seconds]
04:29 -!- Llamamoe [~Llamagedd@188.146.96.127.nat.umts.dynamic.t-mobile.pl] has joined ##hplusroadmap
04:34 -!- ffranr [~ffranr@62-64-228-232.dynamic.dial.as9105.com] has quit [Ping timeout: 246 seconds]
04:44 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
04:54 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has joined ##hplusroadmap
05:27 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
05:31 -!- ffranr [~ffranr@62-64-228-232.dynamic.dial.as9105.com] has joined ##hplusroadmap
05:34 < TMA> I would prefer success to enlightenment, but neither can be obtained. A backwards monster loser it is for everyone then.
05:42 < TMA> fenn: if you say the atrocities were deliberate, you are not entirely correct. not even mostly correct. either the atrocities were inflicted without paying attention (as in "no humans were harmed -- those are subhumans and therefore have no moral weight") or they were "necessary collateral damage" or not even expected
05:43 < TMA> there are very few people that are knowingly evil.
05:46 < TMA> even the two genocides of the 20th century were more of an instrumental goal than anything else. The Armenians and the Jews were just an obstacle to be overcome in the pursuit of a Better World
05:47 < TMA> (or _THE_ Better World to the perpetrators)
05:49 < TMA> mrdata: the trust can rewrite its will by interacting with its environment in a way that provokes the environment to override it
05:51 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
05:57 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has quit [Ping timeout: 240 seconds]
05:59 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has joined ##hplusroadmap
06:01 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
06:02 < mrdata> TMA, it's a mechanism
06:03 < mrdata> only individuals with agency can take legal responsibility for their actions
06:04 < mrdata> you may have agents who work for a trust company; those individuals are responsible to execute the agreements
06:09 < mrdata> a judge can set aside a provision of a will that is illegal
06:10 < mrdata> but it isnt the trust rewriting itself
06:30 -!- ffranr [~ffranr@62-64-228-232.dynamic.dial.as9105.com] has quit [Ping timeout: 256 seconds]
06:35 < TMA> mrdata: agency is a fiction I no longer subscribe to.
06:36 < mrdata> well then rule of law disintegrates, too
06:38 < TMA> why should that be causally connected? those upholding it have no agency to decide to stop upholding it
06:38 < mrdata> why dont you just abuse children then
06:39 < mrdata> because you have legal responsibility for your actions
06:39 < TMA> you cannot decide to stop believing in the existence of agency -- because to decide so you would need the agency to do so
06:39 < mrdata> call it a fiction, then go to jail in my opinion
06:39 < mrdata> youre too dangerous to be set loose on society
06:40 < TMA> I didn't do it before I lost my belief in agency. I have not encountered circumstances to nudge my behavior in that direction. Why should it change?
06:41 < mrdata> there is no scientific basis for your opinion
06:41 < TMA> A change needs an impetus.
06:41 < TMA> mrdata: that's not true
06:41 < mrdata> you just want to escape responsibility for your actions
06:42 < TMA> mrdata: there were experiments that established that given sufficiently detailed brain imaging, you can predict people's decisions before they think they have made them
06:43 < TMA> mrdata: I do not want to escape responsibility. responsibility is a social construct -- it is the opinion of people regarding an action
06:44 < TMA> mrdata: as the people have no agency, they have no impetus to change that opinion :)
06:46 < TMA> mrdata: not even I can. I cannot even choose what I am saying. I can feel the illusion that I can, but in reality I cannot. I cannot stop being aware of that fact.
06:47 < mrdata> predicting people's decisions before they think they have made them is a marvelous game that doesnt bring evidence to bear on agency at all
06:48 < mrdata> might be good for sales, tho
06:49 < TMA> really? if you do have agency, you are able to act on your own will. if you act on a decision before you make it, did you really make that decision? (no, you have not)
06:49 -!- CryptoDavid [uid14990@gateway/web/irccloud.com/x-ucvmtauxpmimdcog] has joined ##hplusroadmap
06:49 < mrdata> youre gaming the idea of when the decision was made
06:50 < mrdata> and trying to say this wasnt making a decision
06:50 < mrdata> yet it is
06:52 < TMA> mrdata: pick a die and throw it. observe it land showing, say, a three. If you "decide" after the observation that you will have thrown a three, does it mean that the three was cast _because_ of that decision? After observing the three, can you "decide" that the die lands on four instead?
06:52 < TMA> (not as a self-deception trick, but as a reality-altering decision)
06:53 < TMA> free will is the sensation you have several milliseconds after the "decision" was already made
06:54 < mrdata> doesnt impress me
06:54 < TMA> whatever
06:54 < TMA> you do not get to choose what impresses you after all
06:55 < mrdata> hahahah
06:55 < TMA> it is futile for me to try
06:55 < TMA> but I cannot choose otherwise either
06:56 < juri_> wow. you two should chill.
07:01 < TMA> mrdata: please do not lose the illusion of agency. from what you said earlier, the illusion of agency and the belief that people not subscribing to it are dangerous are the only barrier that holds the rest of your brain back from committing atrocities
07:02 < mrdata> no, i am embodied and feel pain
07:02 < mrdata> i can sympathize with suffering
07:02 < mrdata> that is also a barrier
07:03 < mrdata> empathize, even
07:03 < TMA> in that you are sadly in agreement with the religious, who believe (based on their self-perception) that only the fear of the divine keeps people from committing atrocities (to a degree, mind you)
07:03 < mrdata> a mechanism can mimic this well enough to give people emotional attachment sometimes (tamagotchi)
07:04 < mrdata> it isnt about fear
07:05 < mrdata> do you stop yourself from doing something because you'd be punished? or do you stop because it is wrong to do it. the former is a more childish view
07:06 < TMA> I have no agency to start doing anything -- likewise, I have no agency to stop
07:06 < mrdata> juri_, i'm feeling very chill thanks; hope you are well
07:08 < TMA> being aware that my skin is not green does not change the color of my skin; being aware of the nonexistence of agency does not change the behaviour either
07:09 < mrdata> the claim that you can predict my decisions gets conflated with the idea that the future is pre-determined. it strongly isnt.
07:10 < TMA> watching a movie unfold and watching your life unfold is not a difference in kind, it is a difference in the level of detail
07:12 < TMA> mrdata: the future _is_ predetermined in a manner of speaking. however there are hard limits on the predictability, because there are hard limits on observability
07:12 < mrdata> proved false
07:12 < TMA> you cannot have determinism _and_ locality of description at the same time
07:12 < mrdata> by the science
07:13 < mrdata> future is not written, marty
07:13 < TMA> but you can lose one or the other: both nonlocal deterministic and local _nondeterministic_ are possible
07:14 < mrdata> if you could replay an event, you could decide to play it differently
07:14 < mrdata> and fundamentally the underlying reality may force that
07:16 < TMA> you cannot replay an event, because you cannot _observe_ the initial conditions
07:17 < TMA> and that's because the observation is local and therefore necessarily non-deterministic
07:18 < mrdata> so you believe your awareness is trapped in a local bubble; fair, i suppose most people do
07:18 < mrdata> but i tire of this discussion
07:19 < TMA> on the other hand, it rarely matters with respect to human behaviour -- there are too many particles in human-environment interaction for quantum effects and heisenberg uncertainty not to be averaged away
07:21 < TMA> and as a summary, a dilbert comic: https://dilbert.com/strip/2015-05-24
07:21 < mrdata> chaos, i.e. sensitive dependence on initial conditions, operates in the brain
07:22 < mrdata> this links it firmly to the quantum
07:22 < mrdata> whereas you can stop and restart a computer simulation
07:23 < mrdata> running it on the same inputs and producing the same result
07:24 < TMA> a piece of cheese is an analog computer simulating a piece of cheese
07:25 < TMA> you can restart a sufficiently small digital simulation
07:26 < TMA> but the results will be exactly the same only if (a) the inputs are exactly the same and (b) the inputs are known (a random bit-flip partway down the computation is an input too)
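[mrdata's stop-and-restart claim, and TMA's caveat that an exact replay requires capturing every input, are easy to demonstrate with a seeded pseudo-random simulation. A minimal sketch using only the Python standard library:

    import random

    def simulate(seed, steps=8):
        """A toy stochastic simulation: a seeded one-dimensional random walk."""
        rng = random.Random(seed)   # the seed is one of the simulation's inputs
        position, path = 0, []
        for _ in range(steps):
            position += rng.choice((-1, 1))
            path.append(position)
        return path

    # Stopping and restarting with identical inputs reproduces the run exactly:
    assert simulate(seed=42) == simulate(seed=42)
    # TMA's caveat: the replay is exact only if *every* input is known and
    # identical; changing one input (here the seed) generally changes the run.
    print(simulate(seed=42))
    print(simulate(seed=43))

A stray bit-flip in hardware is, as TMA says, just another input: it breaks reproducibility precisely because it was never captured.]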
07:29 * mrdata yawns
07:30 < TMA> a rainy saturday afternoon supports idle philosophizing, I guess
07:31 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has quit [Ping timeout: 256 seconds]
07:32 < TMA> back to start: as humans have no agency, a trust doesn't need it either. in both cases any changes can be equally ascribed to their purported "wills"
07:34 < mrdata> i dont doubt that many people will get this wrong, to their peril
07:35 < TMA> (for will, like agency, is just a convention held in human heads, having no reality apart -- no external "it" to exist in the absence of people's beliefs)
07:38 < TMA> all in all, nothing matters. some particle assemblies will "disagree" ;-)
07:38 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
08:05 -!- yashgaroth [~ffffffff@172.58.22.143] has joined ##hplusroadmap
08:09 -!- sknebel_ [~quassel@v22016013254630973.happysrv.de] has joined ##hplusroadmap
08:10 -!- sknebel [~quassel@v22016013254630973.happysrv.de] has quit [Ping timeout: 272 seconds]
08:23 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
08:27 -!- sknebel_ is now known as sknebel
08:36 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has joined ##hplusroadmap
08:47 < gnusha> https://secure.diyhpl.us/cgit/diyhpluswiki/commit/?id=9f264eda Michael Folkson: Add transcript for Socratic Seminar on BIP Schnorr >> http://diyhpl.us/diyhpluswiki/transcripts/london-bitcoin-devs/2020-06-16-socratic-seminar-bip-schnorr/
08:47 < gnusha> https://secure.diyhpl.us/cgit/diyhpluswiki/commit/?id=3ebfd812 Bryan Bishop: Merge pull request #118 from michaelfolkson/add-socratic-schnorr >> http://diyhpl.us/diyhpluswiki/
09:06 -!- Urchin[emacs] [~user@unaffiliated/urchin] has quit [Ping timeout: 244 seconds]
09:21 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has quit [Ping timeout: 240 seconds]
09:41 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
09:46 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has quit [Ping timeout: 265 seconds]
11:41 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
11:46 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has quit [Ping timeout: 240 seconds]
12:03 < L29Ah> [17:32:57] back to start: as humans have no agency, a trust doesn't need it either. in both cases any changes can be equally ascribed to their purported "wills"
12:03 < L29Ah> game theory builds trust w/o any agency or consciousness bullshit
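[L29Ah's point can be illustrated concretely: in an iterated prisoner's dilemma, bare mechanisms playing tit-for-tat sustain mutual cooperation, a functional analog of trust, with no agency or consciousness anywhere in the loop. A minimal Python sketch; the payoff numbers are the conventional textbook values and otherwise an assumption:

    # Iterated prisoner's dilemma: sustained mutual cooperation ("trust")
    # emerges from the payoff structure plus repetition alone.
    # Standard payoffs with T > R > P > S; these exact numbers are assumed.
    PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
              ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

    def tit_for_tat(my_history, their_history):
        """Cooperate first, then copy the opponent's previous move."""
        return their_history[-1] if their_history else 'C'

    def play(strategy_a, strategy_b, rounds=100):
        hist_a, hist_b, score_a, score_b = [], [], 0, 0
        for _ in range(rounds):
            move_a = strategy_a(hist_a, hist_b)
            move_b = strategy_b(hist_b, hist_a)
            pay_a, pay_b = PAYOFF[(move_a, move_b)]
            hist_a.append(move_a); hist_b.append(move_b)
            score_a += pay_a; score_b += pay_b
        return score_a, score_b

    # Two tit-for-tat mechanisms lock into cooperation: 300 points each.
    print(play(tit_for_tat, tit_for_tat))

Neither player has a will; the "trust" is entirely a property of the repeated interaction.]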
12:24 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
12:32 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has quit [Ping timeout: 246 seconds]
12:53 < fltrz> an economic argument against gain of function research: cost of gain of function research + subsequent cost of making a preventative vaccine (for humans) + cost of risk of release > cost of designing a vaccine for the animal reservoir (for bats or whatever the supposed threat is)
12:53 < fltrz> hence gain of function research is just bio warfare in disguise
12:54 < fltrz> the argument assumes making a vaccine for bats is as expensive as making a vaccine for humans, so subtracting from both sides: cost of gain of function research + cost of risk of accidental release > 0
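[fltrz's inequality can be checked mechanically. A sketch with purely hypothetical cost figures, since only the structure of the argument matters, not the numbers:

    # All figures are hypothetical placeholders.
    cost_gof_research   = 10e6    # gain-of-function programme
    cost_human_vaccine  = 500e6   # preventative vaccine for humans
    cost_animal_vaccine = 500e6   # assumed equal to the human vaccine
    p_release, cost_release = 0.01, 1e12   # accidental-release risk
    expected_release_cost = p_release * cost_release

    gof_route       = cost_gof_research + cost_human_vaccine + expected_release_cost
    reservoir_route = cost_animal_vaccine

    # Subtracting the (assumed equal) vaccine costs from both sides leaves
    # cost_gof_research + expected_release_cost > 0, which always holds.
    assert gof_route - reservoir_route == cost_gof_research + expected_release_cost
    print(gof_route > reservoir_route)   # True for any positive costs

The conclusion stands or falls with the equal-cost assumption, which L29Ah disputes below in the opposite direction (animal vaccines being cheaper would only strengthen the inequality).]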
12:58 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has quit [Remote host closed the connection]
13:16 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has joined ##hplusroadmap
13:58 -!- ffranr [~ffranr@62-64-230-206.dynamic.dial.as9105.com] has joined ##hplusroadmap
14:02 -!- Codaraxis_ [~Codaraxis@ip68-5-90-227.oc.oc.cox.net] has joined ##hplusroadmap
14:05 -!- Codaraxis__ [Codaraxis@gateway/vpn/mullvad/codaraxis] has quit [Ping timeout: 260 seconds]
14:05 -!- Codaraxis__ [~Codaraxis@ip68-5-90-227.oc.oc.cox.net] has joined ##hplusroadmap
14:06 -!- Llamamoe [~Llamagedd@188.146.96.127.nat.umts.dynamic.t-mobile.pl] has quit [Quit: Leaving.]
14:06 -!- Codaraxis_ [~Codaraxis@ip68-5-90-227.oc.oc.cox.net] has quit [Ping timeout: 258 seconds]
14:23 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
14:29 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
14:34 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has quit [Ping timeout: 258 seconds]
14:39 -!- Urchin[emacs] [~user@unaffiliated/urchin] has joined ##hplusroadmap
14:45 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
14:50 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has quit [Ping timeout: 240 seconds]
14:55 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
15:14 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has quit [Ping timeout: 246 seconds]
15:15 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
15:44 -!- haxkers [2734c0bc@39.52.192.188] has joined ##hplusroadmap
15:52 -!- darsie [~kvirc@84-114-73-160.cable.dynamic.surfer.at] has quit [Ping timeout: 260 seconds]
16:08 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has joined ##hplusroadmap
16:21 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
16:24 -!- haxkers [2734c0bc@39.52.192.188] has left ##hplusroadmap []
17:00 < fenn> it seems obvious to me that gain of function research is academic cover for biowarfare programs
17:01 < fenn> the biowarfare researchers still have to interface with academia in order to keep pace
17:02 < fenn> as we've seen, "less lethal" biowarfare can be a thing
17:02 < L29Ah> in fact animal vaccines are cheaper to develop as you have unlimited disposable hosts
17:03 -!- justan0theruser [~justanoth@unaffiliated/justanotheruser] has joined ##hplusroadmap
17:05 -!- justanotheruser [~justanoth@unaffiliated/justanotheruser] has quit [Ping timeout: 260 seconds]
17:14 -!- ffranr [~ffranr@62-64-230-206.dynamic.dial.as9105.com] has quit [Ping timeout: 264 seconds]
17:42 -!- shawwwn [uid6132@gateway/web/irccloud.com/x-uwxxhpognhuhixep] has joined ##hplusroadmap
18:06 < yashgaroth> how the fuck do you vaccinate enough wild animals to achieve herd immunity
18:07 < yashgaroth> also yeah biowarfare research will continue regardless of any ban
18:56 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has joined ##hplusroadmap
19:08 -!- dr-orlovsky [~dr-orlovs@xdsl-188-154-186-21.adslplus.ch] has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
20:22 -!- shawwwn [uid6132@gateway/web/irccloud.com/x-uwxxhpognhuhixep] has quit [Quit: Connection closed for inactivity]
20:30 -!- yashgaroth [~ffffffff@172.58.22.143] has quit [Quit: Leaving]
20:50 < nmz787> .tell yashgaroth: salt licks
20:50 < saxo> (.to)
20:50 < nmz787> .to yashgaroth: salt licks
20:50 < saxo> Sorry, 'yashgaroth:' is not a valid nickname
20:50 < nmz787> saxo: yes it is
20:50 < nmz787> who is responsible for the bot these days?
21:29 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has quit [Ping timeout: 256 seconds]
21:37 -!- aeiousomething [~aeiousome@unaffiliated/aeiousomething] has joined ##hplusroadmap
22:08 -!- darsie [~kvirc@84-114-73-160.cable.dynamic.surfer.at] has joined ##hplusroadmap
23:08 -!- Codaraxis_ [Codaraxis@gateway/vpn/mullvad/codaraxis] has joined ##hplusroadmap
23:12 -!- Codaraxis__ [~Codaraxis@ip68-5-90-227.oc.oc.cox.net] has quit [Ping timeout: 264 seconds]
23:35 < L29Ah> .to yashgaroth salt licks
23:35 < saxo> Okay, I'll pass that message along to yashgaroth
23:35 < L29Ah> :P
23:38 -!- CryptoDavid [uid14990@gateway/web/irccloud.com/x-ucvmtauxpmimdcog] has quit [Quit: Connection closed for inactivity]
--- Log closed Sun Jun 21 00:00:56 2020