--- Log opened Sat Aug 15 00:00:41 2015 00:14 -!- nbsp [~g@nv-67-77-147-65.dyn.embarqhsd.net] has joined ##hplusroadmap 00:14 -!- nbsp [~g@nv-67-77-147-65.dyn.embarqhsd.net] has left ##hplusroadmap [] 00:29 -!- Viper168 [~Viper@unaffiliated/viper168] has quit [Quit: Leaving] 00:32 -!- Viper168 [~Viper@unaffiliated/viper168] has joined ##hplusroadmap 01:58 -!- justanotheruser [~Justan@unaffiliated/justanotheruser] has quit [Read error: Connection reset by peer] 02:02 -!- justanotheruser [~Justan@unaffiliated/justanotheruser] has joined ##hplusroadmap 02:25 -!- sheena [~home@S0106c8be196316d1.ok.shawcable.net] has quit [Ping timeout: 240 seconds] 02:43 -!- augur [~augur@c-73-46-94-9.hsd1.fl.comcast.net] has quit [Ping timeout: 250 seconds] 02:46 -!- augur [~augur@c-73-46-94-9.hsd1.fl.comcast.net] has joined ##hplusroadmap 03:27 -!- sandeep_ [~sandeep@111.235.64.135] has quit [Remote host closed the connection] 03:56 -!- AmbulatoryCortex [~Ambulator@173-31-155-69.client.mchsi.com] has joined ##hplusroadmap 04:17 -!- joshcryer [~g@unaffiliated/joshcryer] has quit [] 04:41 < kanzure> .title 04:41 < yoleaux> Modified yeast produce opiates from sugar 04:43 < justanotheruser> kanzure: how do I increase efficiency further 04:43 < justanotheruser> you seem to be in the top 2 most efficient people I know 04:46 < kanzure> have a crippling sense of workaholicism 04:46 < kanzure> "Some synthesizers are more susceptible to humidity than others. In extreme situations, customers have gone so far as to make a ‘tent’ of non-static plastic sheeting around the synthesizer and placed a dehumidifer inside. The increase in coupling efficiency was dramatic." 04:49 < kanzure> justanotheruser: i was brainwashed by an internet cult as a young child, the cult was focused on behavior engineering and productivity informed by reason and logic but really there was just lots of shaming and peer pressure. so try some of that. 04:53 < kanzure> they were focused on tihngs like, out of all the possible behaviors that any of us could generate, which ones should we pick or exclude if any 05:12 -!- CyberelfJess [~CyberelfJ@ip565f6f48.direct-adsl.nl] has joined ##hplusroadmap 05:24 < kanzure> hmm looks like i have most of those old emails 05:24 < kanzure> there was a city layout that used lots of hexagons. huh. 05:28 -!- Madplatypus [uid19957@gateway/web/irccloud.com/x-hhzszmmyihvvsbyw] has quit [Quit: Connection closed for inactivity] 05:30 < kanzure> we were making "hulks" of productivity. it was glorious. 05:31 < kanzure> artist's interpretation http://nakamotoinstitute.org/static/img/mempool/why-bitcoin-will-continue-to-grow/hulk.jpg 05:34 < EnLilaSko> So the answer is a time machine so you can go back to when you were a kid and get brainwashed 05:35 < kanzure> yup 05:35 < kanzure> perhaps not 05:36 < kanzure> although it might explain why i haven't been having good results with anyone older 05:39 -!- nsh [~lol@wikipedia/nsh] has quit [Excess Flood] 05:39 -!- nsh [~lol@wikipedia/nsh] has joined ##hplusroadmap 05:55 < justanotheruser> kanzure: what internet cult 05:56 < kanzure> well it was a small operation, you wouldn't have heard of it 05:56 < justanotheruser> hipster 05:56 < kanzure> i was just looking at some old logs 05:57 < kanzure> this was our glorious leader in 2005ish http://diyhpl.us/~bryan/calxism-archives/chats/biomors_rant_mit.html 05:57 < kanzure> looking back i'm starting to wonder if the point of the cult was to implement an opengl es pipeline for him... 05:58 < justanotheruser> hmm, cprogramming? gamedev? 
05:58 -!- nsh [~lol@wikipedia/nsh] has quit [Excess Flood] 05:58 < kanzure> yeah, one of our projects was an mmorpg engine that we went on to sell 05:58 -!- nsh [~lol@wikipedia/nsh] has joined ##hplusroadmap 05:58 < justanotheruser> oh really 05:59 < justanotheruser> are those pidgin logs? 05:59 < kanzure> back then it was called gaim :-) 05:59 < justanotheruser> pidgin sucks :( 06:00 < justanotheruser> I know this because I still use it for outdate messaging protocols 06:00 < justanotheruser> the usual reaction I get when someone knows I have AIM is something something middle school 06:00 < kanzure> well, i'ts not helpful that every chat service has closed up- no more xmpp on gchat, facebook, etc. 06:00 < kanzure> a friend of mine invented the aim subprofile, true story 06:00 -!- c0rw|zZz is now known as c0rw1n 06:01 < justanotheruser> hmm, no idea what that is, must be before my time 06:03 < kanzure> it was the ability to click links inside of profiles on aim 06:03 < kanzure> you know.. like where you kept all the hilarious AIM quotes. 06:05 < justanotheruser> oh right, I remember being hilarious in middle school 06:06 < kanzure> indeed 06:27 < kanzure> justanotheruser: another thing that helps is picking projects 06:28 -!- nsh [~lol@wikipedia/nsh] has quit [Excess Flood] 06:28 -!- nsh [~lol@wikipedia/nsh] has joined ##hplusroadmap 06:33 -!- nsh [~lol@wikipedia/nsh] has quit [Excess Flood] 06:34 -!- nsh [~lol@wikipedia/nsh] has joined ##hplusroadmap 06:47 < justanotheruser> kanzure: what does that mean 06:49 < kanzure> working on ambitious projects is a helpful way of working on... oh. 06:49 < kanzure> well i guess it's a tautology. 06:52 -!- wrldpc1 [~ben@hccd37dda2b.bai.ne.jp] has joined ##hplusroadmap 07:00 < justanotheruser> yes, I have a project and I think I am in the process of turning into a workaholic, I am just wondering how to make my working hours more efficient 07:00 -!- CheckDavid [uid14990@gateway/web/irccloud.com/x-tfpndrdhyunawnsb] has joined ##hplusroadmap 07:00 < kanzure> justanotheruser: peer pressure might work. i could yell at you for hours if you want? 07:01 < justanotheruser> maybe you could record yourself doing it for a few minutes and I could play it over and over while I sleep 07:01 < kanzure> well that's certainly one idea 07:02 < archels> where do I sign up for this 07:02 < kanzure> "if you don't get this done everyone is going to die" 07:02 -!- sandeep [~sandeep@111.235.64.135] has joined ##hplusroadmap 07:03 < archels> having a girlfriend doesn't help with the whole workaholic thing, let me tell you that 07:03 < kanzure> "an ai in the future is going to judge you for your incompetence and will time travel to.." well i forget how that one goes. 
07:03 -!- sandeep [~sandeep@111.235.64.135] has quit [Read error: Connection reset by peer] 07:03 < kanzure> archels: perhaps you just need a workaholic girlfriend 07:03 -!- sandeep [~sandeep@111.235.64.135] has joined ##hplusroadmap 07:03 < archels> kanzure: the AI will torture emulations of you till eternity 07:03 < kanzure> no, it's emulations of your friends 07:03 -!- seanph [~seanph@98.126.7.242] has joined ##hplusroadmap 07:03 -!- sandeep is now known as Guest92884 07:04 < kanzure> greetings seanph 07:04 -!- Guest92884 [~sandeep@111.235.64.135] has quit [Read error: Connection reset by peer] 07:04 < archels> that might be more effective for emphatically inclined people, sure 07:04 < seanph> hey Bryan 07:04 < archels> empathically, rather 07:04 -!- sandeep_ [~sandeep@111.235.64.135] has joined ##hplusroadmap 07:04 < justanotheruser> archels: yep, eunuchs are the most efficient transhumanists 07:04 < kanzure> seanph: so this is the dna synthesizer crew, among other things 07:05 * archels involuntarily crosses his legs 07:05 < kanzure> justanotheruser: do you actually know any eunuchs? 07:05 < archels> admittedly I've given serious thought to castration for longevity reasons 07:05 < justanotheruser> 1) become eunuch, 2) solve transhumanism, 3) reverse age and eunuchism 07:06 < justanotheruser> kanzure: no, but they live longer and are more logical 07:06 < kanzure> you don't need germline cells to reproduce anymore, so castration doesn't sound so bad 07:06 < archels> doesn't seem to be much correlational evidence though 07:07 < kanzure> oh i thought there was evidence ? 07:07 < archels> some, scant, as far as I remember 07:08 < kanzure> someone should look into that 07:08 < fenn> it's only like 20% at best 07:08 < fenn> work much better in worms 07:08 < AmbulatoryCortex> My wife would be rather upset with me if I became a eunuch. 07:09 < seanph> AmbulatoryCortex: +1 07:09 < archels> which further proves the theory that transhumanism and girlfriends/wives do not a good combination make 07:09 < kanzure> you can inject dna into females through other means (or sperm) 07:10 < AmbulatoryCortex> kanzure, the sperm part is about to be remedied shortly anyway 07:10 < AmbulatoryCortex> my fertility isn't the problem :P 07:10 < fenn> kanzure's productivity is due to his massive amphetamine use, not because he doesn't have a girlfriend 07:10 < kanzure> i was putting together a document yesterday about how cheap it is to get a surrogate pregnancy + in vitro fertilization + donor sperm or donor eggs or converting your skin cells to stem cells or other reproductive material. 07:10 < kanzure> actually i do have a girlfriend at the moment 07:10 < fenn> well, pretend i used the subjunctive tense then 07:11 < justanotheruser> oh damn 07:11 < seanph> ah man, living in China I miss amphetamines 07:11 < seanph> so jelly 07:11 < kanzure> seanph: adderall is the only reaosn why i have non-nil working memory... 07:11 < kanzure> *reason 07:11 < seanph> haha 07:11 < archels> kanzure: someone from the internet? I can see how that might work out if she's also a workaholic. 07:11 < kanzure> yeah, i selected her because of her workaholic tendencies 07:11 < kanzure> it's pretty great 07:12 < justanotheruser> kanzure: you're here 17 hours/day, is the time you spend with your gf spent while multitasking on IRC? 07:12 < fenn> if you hire a surrogate to get pregnant with donated sperm, whose child is it? 
07:12 -!- punsieve [~drandomtu@2601:185:8001:fcb0:8d64:18db:270d:5262] has joined ##hplusroadmap 07:12 < kanzure> fenn: well, i think surrogates often refers to "in vitro fertilization is mandatory" 07:13 < fenn> i mean some sperm from a sperm bank 07:13 < kanzure> then it wouldn't be you (assuming you're a male) 07:13 < kanzure> seanph: so what did you think of the dna synthesis documents? 07:13 < AmbulatoryCortex> yeah, whose child is it if you get a donor egg and sperm, and have a surrogate carry the baby? 07:14 < justanotheruser> it's whoever the state declares the owner to be 07:14 < kanzure> and then put the kid up for adoption 07:14 < seanph> kanzure: honestly I don't know enough to judge them. they look legit / serious and above my head scientifically 07:14 < AmbulatoryCortex> kanzure, heh 07:14 < seanph> kanzure: I have a lot of study to do before getting seriously into this stuff 07:14 < kanzure> seanph: well, the basic idea is to use an inkjet printhead to do a few million spots per second, use droplts of the reagents to perform separate reactions in each separate spot or dot on the surface 07:14 < kanzure> *droplets 07:15 < seanph> kanzure: I can certainly understand the machines and electronics, just not what/why they are doing what they are doing 07:15 < seanph> kanzure: Yeah, that makes sense. inkjets are old tech tho - why is this something new? 07:15 < kanzure> "why does the industry suck" you mean? 07:15 < seanph> hehe 07:15 < seanph> it certainly does have a higher barrier to entry than many others 07:16 < kanzure> well also lots of people perceive it as difficult 07:16 < kanzure> i mean the original dna synthesis tech won a nobel prize in the 60s 07:16 < kanzure> and then that person went on to lead the group at a company to make an automated machine 07:16 < kanzure> and then there were lots of patents for 20-30 years that prevented anyone from doing anything as a company 07:16 < justanotheruser> what is difficult about the inkjet synthesis? Are these common? 07:16 < kanzure> this is how most dna synthesizers are designed: https://www.takeitapart.com/guide/94 07:17 < seanph> kanzure: Actually looking through that now. pictures were slow to load due to communism 07:17 < kanzure> well imagine a beige xerox machine, 18 bottles on the front, pneumatic system to push chemical reagents around with argon 07:18 < fenn> DNA XEROX to complement your DNA INKJET and DNA LASER PRINTER 07:18 < kanzure> i'm pretty sure xerox machines pioneered that beige office look 07:18 < fenn> oh and DNA STAPLES for DNA ORIGAMI 07:19 < seanph> doesn't illustrate that clearly what the machine actually does - but you are making it sound like all you need to do is put droplets on top of one another 07:19 < fenn> (why does origami need staples??) 07:19 < fenn> yep most of the magic is just putting droplets on top of each other 07:19 < seanph> and I guess that basically makes sense 07:19 < kanzure> most of the machines on the market use a single "column" that they pump liquids and reagents through 07:20 < justanotheruser> Are inkjets used for other syntheses? 07:20 < kanzure> so you have to route the liquids to that column and then apply pressure 07:20 < gradstudentbot> Apparently my PI got this grant back in 1961. I think ARPA has forgotten about the lab and everything. 07:20 < kanzure> and you can only have one unique dna molecule that you are synthesizing per column.
(but lots of copies of that molecule are constructed simultaneously in the column) 07:20 < seanph> I guess you have a droplet of, say, adenine, and then you have some enzyme, and then something else to eliminate the remaining adenine? 07:20 < kanzure> (like micrograms or milligrams of the compound) (whereas the inkjet dna synthesizer makes substantially less per spot) 07:21 < kanzure> well it's purely a chemical synthesis actually 07:21 < kanzure> so you use phosphoramidites that are mimics of adenine 07:21 < kanzure> and there's an enzyme but it's not a biological enzyme, it's a chemical enzyme activator heh 07:21 < kanzure> i mean i would certainly prefer a purely-biological approach to dna synthesis, but nobody has figured that out yet really 07:22 < seanph> all I mean is, I can see how you would do it that way 07:22 < fenn> catalyst, not enzyme 07:22 < seanph> as I said, I don't know this topic 07:22 < kanzure> catalyst, yes, sorry 07:23 < kanzure> our design is missing some steps regarding what to do after you print the oligos on a surface 07:24 < kanzure> phosphoramidite chemistry to synthesize oligonucleotides/dna only works for dna molecules of length 20 to 100 (20 to 100 base pairs (bp)) depending on reaction conditions, humidity, etc... 07:24 < kanzure> so you have to ligate (combine) the molecules together if you want to make single dna molecules that specify a protein or multiple proteins.. 07:28 < kanzure> justanotheruser: yes inkjets are sometimes used for other syntheses 07:29 < seanph> I've definitely heard of inkjets used in chemistry 07:29 < kanzure> but that's more of a question for CaptHindsight 07:29 < seanph> just out of curiosity, have you guys seen acoustic levitation for chemistry? inkjets are often used in that 07:29 < kanzure> well we have seen acoustic levitation of small objects 07:30 < kanzure> and acoustic cavitation in microfluidic channels... or ultrasonic pumping by cavitation, etc. 07:30 < seanph> a practical use of the levitation is in chemistry - you get isolated droplets 07:30 < seanph> can add reagents to them, can move them around 07:30 < seanph> probably even combine them 07:31 < kanzure> fenn prefers that technique to the inkjet approach i think 07:31 < kanzure> there is a technique called "EWOD" or electrowetting on dielectric 07:31 < seanph> well, inkjets are commonly used to add the reagens 07:31 < gradstudentbot> Uh, interesting question. 07:31 < kanzure> where you put a droplet on a superhydrophobic surface 07:31 < kanzure> and then you have an array of TFT elements underneath which causes the droplets to move 07:32 < kanzure> https://www.youtube.com/watch?v=LzbFPxWd2s4 07:32 < kanzure> .title 07:32 < yoleaux> DMV Case Study 4 - Electrowetting - YouTube 07:32 < kanzure> er this one seems to be using gold electrodes 07:32 < seanph> wow.. that is really cool 07:32 < seanph> I want to make one :-D 07:32 < seanph> what is the scale here? 07:33 < kanzure> i believe that was macroscopic 07:33 < kanzure> better example https://www.youtube.com/watch?v=k9YE4jf-wzo 07:35 -!- Merovoth [~Merovoth@unaffiliated/merovoth] has quit [] 07:35 < seanph> what is the scale here? 
07:35 < fenn> SCOEW is better than EWOD 07:35 < kanzure> in the last video there was a probe tip- i assume the droplet is at least 1 mm diameter 07:35 < fenn> scale varies depending on the setup, could be anywhere from 1 micron to 1 mm 07:36 < kanzure> i have not seen a 1 micron SCOEW setup 07:36 < seanph> wow https://www.youtube.com/watch?v=JvDZh8hmR84 07:36 < kanzure> .title 07:36 < yoleaux> DNA Lab on a Chip - YouTube 07:37 < kanzure> seanph: also you can move droplets with lasers if you put the droplet on a photoconductive surface (this video is longer and is probably incompatible with communism) https://www.youtube.com/watch?v=8PeYwGDnt7I 07:37 < kanzure> the problem with microfluidics is that it's much harder to debug 07:37 < kanzure> and valves are a pain in the ass 07:37 < kanzure> moving droplets on a surface removes the problem with valves, at least 07:38 < seanph> it seems like what you are trying to do here applies to a lot more than DNA synthesis 07:39 < kanzure> (most microfluidic projects that require valves use pneumatic valves where the air or gas is in a cross-channel, so that the channel underneath or above can be "pinched" when you increase the pressure) 07:39 < seanph> it's just how to do chemistry automatically, on a small scale 07:39 < fenn> doesn't have to be a laser, you can move droplets with an LCD screen layered with special materials 07:39 < seanph> here's a pretty macro example :-p https://www.youtube.com/watch?v=C677yPYXWIs 07:39 < kanzure> yes but how many droplets per LCD screen.. it's not a lot. 07:40 < kanzure> yes that is the project from gaudi, which is open-source 07:40 < kanzure> i believe hackteria was involved in that project 07:40 < justanotheruser> seanph: neat 07:41 < justanotheruser> no info from the description though 07:41 < kanzure> it's in the logs.. one sec. 07:41 < kanzure> http://hackteria.org/wiki/Elektrowetting 07:42 < fenn> "SCOEW overcomes the size limitation of physical pixilated electrodes by utilizing dynamic and reconfigurable optical patterns and enables the continuous transport, splitting, merging, and mixing of droplets with volumes ranging from 50 microL to 250 pL," 07:42 < seanph> kanzure: Very cool.. definitely on my TODO list to try 07:43 < kanzure> yes but what density of 250 pL droplets 07:43 < kanzure> not sure if density is the measurement i want 07:43 < kanzure> number of droplets per cm^2 of SCOEW 07:44 < fenn> less droplets than pixels but i don't think it's more than an order of magnitude less 07:44 < kanzure> also you have to leave room for routing/movement/paths 07:44 < fenn> of course 07:44 < kanzure> and.... wash steps. 
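The oligo-length limit mentioned earlier (phosphoramidite chemistry only yields roughly 20-100 nt per synthesis, so anything gene-sized has to be assembled from overlapping fragments and ligated) is easy to picture with a toy fragmenter. A minimal sketch in Python; the fragment length and overlap are arbitrary illustrative parameters, not values from the discussion:

```python
# Toy illustration of oligo assembly: phosphoramidite synthesis only
# gives short oligos (~20-100 nt), so a longer target sequence is
# ordered as overlapping fragments that are later ligated/assembled.
# Fragment length and overlap below are arbitrary placeholders.

def split_into_oligos(target, oligo_len=60, overlap=20):
    """Return overlapping fragments that together cover `target`."""
    step = oligo_len - overlap
    oligos = []
    for start in range(0, len(target), step):
        oligos.append(target[start:start + oligo_len])
        if start + oligo_len >= len(target):
            break
    return oligos

if __name__ == "__main__":
    # 300 nt of dummy sequence standing in for a real gene.
    target = ("ATGGCTAGCAAGGAGGAACTT" * 15)[:300]
    frags = split_into_oligos(target)
    print(len(frags), "oligos, lengths:", [len(f) for f in frags])
```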
07:44 < fenn> whine whine 07:44 < kanzure> how are you going to do chemistry if you pollute every droplet 07:45 < seanph> magic 07:45 < kanzure> k 07:46 < seanph> oops - apple+f is not find in this app :-p 07:48 < fenn> a 250 picoliter droplet is 50 micron diameter 07:49 < fenn> an ipad-mini lcd display has 80 micron pixels 07:49 < kanzure> i think you need at least 50 micron diameters worth of pixels, plus edges plus extra room (because you have to animate the pixels to move the droplets) 07:49 < fenn> yes you need at least 9 pixels per droplet maybe more 07:50 < kanzure> if you look at the SCOEW video you can count the pixels used in those animations 07:50 < seanph> this is probably ignorant, but CRTs and electron guns come to mind for me when reading about this 07:51 < kanzure> SCOEW video https://www.youtube.com/watch?v=u_Be2awFf0c 07:51 < kanzure> (36sec) 07:51 < fenn> crt's are brighter but lower resolution than modern lcd's for a given area, also it would be annoying to pack a crt monitor into a microscope thingy 07:52 < seanph> well I was not thinking of using phosphors, just charging the substrate directly with the electron gun 07:52 < kanzure> the pattern that appears at time 0m 33sec in that video shows how they move a droplet with an lcd 07:52 < fenn> oh, that doesn't work because the droplets outgas and disrupt the electron beam 07:52 < seanph> can't hit the other side? 07:52 < fenn> but you can do the same thing with a UV laser 07:53 < fenn> glass isn't transparent to e-beam 07:54 < seanph> again, probably ignorant, but if you have a very thin piece of glass (or whatever substrate), then does that matter? 07:54 < fenn> it would improve the resolution, but i don't see how a very thin piece of glass could stand up to vacuum and atmospheric pressure 07:55 < seanph> yeah, that is an issue.. has to be very small I guess, and then it's hard to make 07:56 < fenn> i think there is a lot of untapped potential for e-beam selective resin curing in 3d printing 07:56 < fenn> but that's not the current topic 07:57 < fenn> i love how the droplets shuffle along like chibi totoro 07:58 < seanph> it's very cool 07:59 < seanph> CRTs do get quite small.. I'm playing with a little Russian one right now, and this is even tinier https://www.youtube.com/watch?v=zwdL2gT6844 07:59 < fenn> they're moving a droplet with a dark line that is 5 pixels wide so at most you would need 100 pixels per droplet (but that seems overkill to me) 08:01 < seanph> it would be interesting to play with driving some of those little CRTs and see how narrow one could make the beam - the beam shape is not usually fixed 08:01 -!- wrldpc1 [~ben@hccd37dda2b.bai.ne.jp] has quit [Quit: wrldpc1] 08:01 < fenn> so on an ipad mini display you could have 3,145,728/100 to 3,145,728/9 droplets = 30k to 300k droplets 08:01 < seanph> the old school drive circuitry would have been the limiting factor 08:02 < fenn> the beam shape is not fixed? 08:03 < seanph> usually one applies a voltage to "focus" 08:03 < seanph> and through that one gets a bigger or smaller dot on the screen 08:04 < seanph> old timey drive circuits would've set that to fill up the screen at the resolutions they had available 08:04 < seanph> NTSC or whatever 08:04 < fenn> i see 08:04 < seanph> there obviously must be a lower limit, probably set by the phosphors 08:04 < seanph> but on a monochrome CRT, I think the phosphors are molecular-scale? 
08:04 < seanph> (on color they are broken into groups) 08:05 -!- xtalmath [~xtalmath@ip-81-11-174-236.dsl.scarlet.be] has joined ##hplusroadmap 08:05 < seanph> so yeah, there's a chance that if you drove a monochrome CRT differently, you could create a really tiny beam and shine it only where you want 08:07 < seanph> I'm playing with hacking an old Russian one similar to this, so will try it out some time soon I'm playing with one similar to this http://svo.2.staticpublic.s3-website-us-east-1.amazonaws.com/zloshnik/ 08:08 < fenn> crt clocks seem to be getting popular again 08:08 < kanzure> oh so is that how it works http://lh4.googleusercontent.com/-G04PK379SpA/UJRNkYR-h4I/AAAAAAAAMEU/c8S0W8TADSE/s800/image04.jpg 08:08 < seanph> :-D 08:08 < CaptHindsight> many of the old video game arcade monitors were XY vs raster scan 08:09 < fenn> oscilloscope monitors too 08:09 < fenn> probably easier to find 08:09 < seanph> yeah, these were meant for scopes 08:11 -!- yashgaroth [~ffffff@2602:306:35fa:d500:f5e0:f867:a11d:8d52] has joined ##hplusroadmap 08:11 < fenn> wah i miss my analog scope 08:11 < fenn> having to rebuild from scratch is tiresome 08:12 < fenn> i haven't found any good scrap yards either 08:12 < fenn> can't cast metal anywhere 08:13 < seanph> can't you just buy an analog scope on ebay? 08:13 < fenn> i used to have so much stuff i got for nearly free so everything seems unreasonably expensive 08:14 < fenn> $750 for the model i had 08:15 < fenn> anyway i don't really need an analog scope 08:15 < fenn> but i do need building materials and a workshop 08:17 < seanph> http://www.tpub.com/neets/book16/33NP0118.GIF 08:17 < seanph> http://www.circuitstoday.com/wp-content/uploads/2009/09/CRT-Cathode-Ray-Tube.jpg 08:18 < seanph> for those who are curious 08:22 < CaptHindsight> just make your own CRT, electron guns are easy to make and you can use whatever phosphor you want 08:22 < fenn> nmz787 is into all that electron microscope stuff 08:23 < seanph> CaptHindsight: Yeah, for something like this, you'd probably want to make your own ultimately 08:23 < seanph> could be a lot easier than making your own LCD or other micro-array 08:23 < seanph> or aiming lasers that precisely and rapidly 08:24 < CaptHindsight> it also comes down to how good a scrounger you are 08:27 < seanph> wellp, time for sleep here in China - been interesting 08:27 < seanph> I'll be back 08:27 -!- seanph [~seanph@98.126.7.242] has quit [] 08:28 < justanotheruser> china? That seems useful for shipping transhumanist goods. 08:28 -!- seanph [~seanph@98.126.7.242] has joined ##hplusroadmap 08:28 -!- seanph [~seanph@98.126.7.242] has quit [Client Quit] 08:28 < fenn> the point is you don't make an LCD you buy one 08:28 < fenn> they are like $50 08:29 < justanotheruser> Is it racist to assume he's within 10km of a place where you CAN make your own LCD? 08:30 < fenn> hell a whole tablet is $50 these days 08:31 < fenn> justanotheruser: obviously it's racist to ask if something is racist, you racist! be ashamed, be very ashamed! 08:31 < justanotheruser> oh :( 08:35 -!- Viper168 [~Viper@unaffiliated/viper168] has quit [Ping timeout: 255 seconds] 08:46 -!- xtalmath [~xtalmath@ip-81-11-174-236.dsl.scarlet.be] has quit [Quit: Leaving.] 
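A back-of-the-envelope check of the droplets-per-display arithmetic above, in Python. The 2048x1536 resolution is an assumed figure for an iPad-mini-class panel; the 9-100 pixels-per-droplet range and the 250 pL volume come from the discussion:

```python
# Back-of-the-envelope check of the droplets-per-display estimate above.
# Assumes a 2048x1536 LCD (iPad-mini-class) and 9-100 pixels per droplet
# (droplet footprint plus room to animate its movement).
import math

pixels = 2048 * 1536  # 3,145,728 pixels
for px_per_droplet in (100, 9):
    print(f"{px_per_droplet:>3} px/droplet -> ~{pixels // px_per_droplet:,} droplets")

# Equivalent diameter of a 250 pL droplet as an idealized free sphere;
# a droplet sitting on a surface flattens out, so footprint figures
# quoted in the discussion will differ from this.
vol_m3 = 250e-12 * 1e-3            # 250 picolitres in cubic metres
diameter = (6 * vol_m3 / math.pi) ** (1 / 3)
print(f"250 pL sphere diameter ~= {diameter * 1e6:.0f} microns")
```

This reproduces the roughly 30k-300k droplets per display quoted above.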
08:49 < kanzure> no but really, wash steps 08:57 -!- PatrickRobotham [uid18270@gateway/web/irccloud.com/x-amswctqhojyxrvqg] has quit [Quit: Connection closed for inactivity] 09:03 < kanzure> here is what the lesswrong crowd is up to :-/ http://rationalfiction.io/ 09:06 < punsieve> I don't see what is ":-/" about this. 09:06 -!- CheckDavid [uid14990@gateway/web/irccloud.com/x-tfpndrdhyunawnsb] has quit [Quit: Connection closed for inactivity] 09:08 < kanzure> punsieve: well they are otherwise a group of people that could be highly productive, but instead they just read HPMOR and pony fanfic over and over again. that is :-/. 09:09 < kanzure> although if they write good scifi i could possibly overlook this... but a terminator fanfiction? not sure. 09:09 < kanzure> terminator fanfic just plays into their "omg ai is taking over the planet and will kill everyone" fears. 09:10 < punsieve> ah, but one could say the same thing about playing video games or watching movies... if this is an outlet, why not? If someone else reads it and learns something, then spiffing. If they don't, then it's no more detrimental (and probably less) than reading some other crap 09:11 < kanzure> theoretically they are not a community oriented around "playing video games" or "reading/writing fanfiction". if they were advertised as such then maybe... but they aren't. 09:13 < punsieve> I'm not familiar with the group's history, I only know of them because of fan fiction, and that is probably true for most others. A cursory glance makes it look like it is a place to post speculative fiction based around manipulating extant fiction universes 09:14 < kanzure> lesswrong is http://lesswrong.com/ 09:14 < kanzure> rationalfiction.io seems to be a direct byproduct of lesswrong's "rationality bootcamps" etc 09:15 < gradstudentbot> Could you get me access to his organs? 09:15 < kanzure> maybe later, gradstudentbot 09:15 < gradstudentbot> I punched my PI and that's why I work here now :\ 09:16 < punsieve> it is a direct byproduct of HPMOR's popularity and a hope that more "stuff" with the same concept will draw more enthusiasts to the cause 09:16 < punsieve> or that is my bet 09:18 < gradstudentbot> I have to read all these articles. 09:19 -!- CaptHindsight [~2020@unaffiliated/capthindsight] has quit [Quit: gone] 09:20 < kanzure> i haven't seen much evidence that that particular strategy has positive results 09:20 < fenn> future scenario: the scene is a dim a windswept wasteland, populated by sparse craggled mechatrees shining icily in the summer light. the ten thousandth generation gradstudentbot sits hunched under a mechatree endlessly reading rational fanfic anthologies 09:20 < gradstudentbot> The paper got rejected. 09:21 < fenn> they want to upload humans to write fanfiction, no joke 09:21 < kanzure> you mean rationality fanfic 09:21 < kanzure> gotta be specific 09:22 < fenn> didn't seem to understand my objections that the market would quickly be saturated 09:22 < kanzure> was this a steve conversation? 09:22 < fenn> no, someone from a rationality bootcamp 09:22 < kanzure> oh right you may have been exposed to these bootcamps 09:22 < kanzure> tell me things 09:23 < fenn> i haven't been there, only heard stories 09:23 < fenn> apparently it's something like an unconference where 30ish people sleep in a house for 3 weeks 09:23 < punsieve> that's a lot of sleep 09:23 < fenn> they wake up and do math problems 09:24 < kanzure> "have you accepted our lord and savior yudkowsky into your heart yet?" 09:24 < kanzure> "no? 
then back to math problems." 09:24 < fenn> HPMOR is what brought them there in the first place 09:25 < kanzure> someone in the diybio community had plans for a biohacking bootcamp of sorts, where during the 3 weeks you would learn actually useful shit like reverse engineering and semiconductor manufacturing or cell transformation techniques 09:26 < fenn> ugh i have to leave for a while, the smell in here is killing me 09:26 < kanzure> if a fanfic can motivate people to sit around doing math problems maybe it can motivate them to stand around casting metal 09:26 < punsieve> it motivated me to download a lot of science podcasts 09:31 < mgin> topic? 09:32 < mgin> oh. i just finished reading HPMOR all the way through. like the 7th time i've read it overall probably :D 09:32 < punsieve> "...he said in a voice colder than zero Kelvin" that is one of the worst things I have ever read. FYI 09:33 < mgin> oh come on 09:33 < mgin> there are a lot more things to criticize than that 09:34 < mgin> that's not really what it's being judged good for 09:34 < punsieve> it's a RATIONALITY and SCIENCE FRIENDLY fan fiction. And that line made it through to completion? Really? 09:35 < mgin> oh the "below 0" bothers you? geez that pedantic 09:40 < punsieve> Yes. It in fact sheds doubt on everything else the author writes, if he can get that obsessed with his own words as to type that garbage. It is a throwaway line that could easily be improved by changing one word to two, "cold as," and yet no one suggested that? 09:42 < punsieve> ugh, colder than to cold as. Talk about precision of language fail. 09:52 -!- punsieve [~drandomtu@2601:185:8001:fcb0:8d64:18db:270d:5262] has quit [Quit: Leaving] 10:10 -!- Merovoth [~Merovoth@unaffiliated/merovoth] has joined ##hplusroadmap 10:11 -!- wrldpc1 [~ben@hccd37dda2b.bai.ne.jp] has joined ##hplusroadmap 10:21 -!- wrldpc1 [~ben@hccd37dda2b.bai.ne.jp] has quit [Quit: wrldpc1] 10:35 < kanzure> apparently this group did the nightvision chlorin thing http://scienceforthemasses.org/ 10:35 < kanzure> rich lee is still trying to use sensationalism or following the footsteps of sterlac i guess. ugh. http://www.theguardian.com/artanddesign/architecture-design-blog/2015/aug/14/body-hackers-the-people-who-turn-themselves-into-cyborgs 10:36 < kanzure> oh this is the anissimov/rachel schizophrenic thingy. got it. forget i mentioned either of those. 11:29 -!- sheena [~home@S0106c8be196316d1.ok.shawcable.net] has joined ##hplusroadmap 11:31 -!- Houshalter [~Houshalte@oh-71-50-58-200.dhcp.embarqhsd.net] has joined ##hplusroadmap 11:33 -!- Madplatypus [uid19957@gateway/web/irccloud.com/x-ymdopkzvdjwiqfkm] has joined ##hplusroadmap 11:36 < mgin> schizophrenic thingy 11:36 < mgin> ? 11:45 -!- EnLilaSko- [~Nattzor@host-85-30-145-65.sydskane.nu] has joined ##hplusroadmap 11:46 -!- EnLilaSko [~Nattzor@unaffiliated/enlilasko] has quit [Ping timeout: 264 seconds] 11:47 -!- erasmus [~esb@unaffiliated/erasmus] has quit [Read error: Connection reset by peer] 11:51 -!- EnLilaSko- is now known as EnLilaSko 11:51 -!- EnLilaSko [~Nattzor@host-85-30-145-65.sydskane.nu] has quit [Changing host] 11:51 -!- EnLilaSko [~Nattzor@unaffiliated/enlilasko] has joined ##hplusroadmap 11:59 -!- sandeep_ [~sandeep@111.235.64.135] has quit [Quit: Leaving] 12:10 -!- nsh [~lol@wikipedia/nsh] has quit [Excess Flood] 12:10 -!- nsh [~lol@wikipedia/nsh] has joined ##hplusroadmap 12:23 -!- math3 [uid54897@gateway/web/irccloud.com/x-gaevwdiubdorsnsy] has joined ##hplusroadmap 12:24 < maaku> ... 
and there goes the rest of my productivity for the month 12:26 < mgin> ? 12:39 -!- EnLilaSko- [~Nattzor@85.30.145.65] has joined ##hplusroadmap 12:42 -!- EnLilaSko [~Nattzor@unaffiliated/enlilasko] has quit [Ping timeout: 256 seconds] 12:44 -!- EnLilaSko [~Nattzor@unaffiliated/enlilasko] has joined ##hplusroadmap 12:44 -!- justanotheruser [~Justan@unaffiliated/justanotheruser] has quit [Read error: Connection reset by peer] 12:46 -!- justanotheruser [~Justan@unaffiliated/justanotheruser] has joined ##hplusroadmap 12:46 -!- EnLilaSko- [~Nattzor@85.30.145.65] has quit [Ping timeout: 245 seconds] 12:49 -!- EnLilaSko [~Nattzor@unaffiliated/enlilasko] has quit [Ping timeout: 252 seconds] 13:26 -!- EnLilaSko [~Nattzor@unaffiliated/enlilasko] has joined ##hplusroadmap 13:32 < kanzure> maaku: ultimately crap is much cheaper to create than high quality insight 13:47 < mgin> true 13:47 < kanzure> you still haven't explained why you think HPMOR is a good strategy for immortality 13:48 < mgin> ... 13:48 < mgin> HPMOR is a fictional story 13:49 < mgin> I can't imagine how HPMOR could even be mapped to the concept of a "good strategy for immortality", and have never heard anyway suggest such a thing 13:49 < mgin> anyone* 13:51 < kanzure> calm down, i'm just giving you some of your own shit back to you. don't you remember your original messages in here? 13:51 -!- delinquentme [~delinquen@74.61.157.78] has joined ##hplusroadmap 13:52 < mgin> yeah I remember your shenanigans completely. you seem to have trouble understanding exactly what things the concept of a "good strategy for immortality" could map to 13:52 < mgin> this is yet another example 13:53 < kanzure> ... to hyperlinks? :-) 13:53 < mgin> apparently 13:53 < mgin> asking, "what's a good strategy for achieving immortality" and getting back "HPMOR" is just as much a non-sequitor 13:55 < kanzure> perhaps you should explain your alternative in more detail, since at the moment i still think it might be HPMOR :-) 13:56 < mgin> that's a liberal use of the word 13:57 -!- Viper168 [~Viper@unaffiliated/viper168] has joined ##hplusroadmap 13:58 < mgin> but just to answer because I want to, my approach is similar to Eliezer's very generally speaking, but has very significant differences 13:59 < kanzure> go on 14:00 < mgin> anyway, I agree with his safety concerns, but don't exactly have any confidence he can solve them, or that he would do it correctly even if he thought he could 14:03 < kanzure> so "create an ai and make the ai figure it out"? 14:03 < mgin> not exactly 14:04 -!- sheena [~home@S0106c8be196316d1.ok.shawcable.net] has quit [Ping timeout: 272 seconds] 14:04 < fenn> yo dawg i heard you like meta 14:05 < fenn> so i didn't do anything 14:06 < kanzure> you didn't leave? 
14:07 < kanzure> oops i fail, nevermind 14:11 -!- Houshalter [~Houshalte@oh-71-50-58-200.dhcp.embarqhsd.net] has quit [Quit: Quit] 14:11 -!- Houshalter [~Houshalte@oh-71-50-58-200.dhcp.embarqhsd.net] has joined ##hplusroadmap 14:17 -!- Viper168 [~Viper@unaffiliated/viper168] has quit [Quit: Leaving] 14:28 -!- Viper168 [~Viper@unaffiliated/viper168] has joined ##hplusroadmap 14:29 -!- math3 [uid54897@gateway/web/irccloud.com/x-gaevwdiubdorsnsy] has quit [Quit: Connection closed for inactivity] 14:34 -!- xtalmath [~xtalmath@ip-83-134-182-51.dsl.scarlet.be] has joined ##hplusroadmap 15:08 -!- CyberelfJess [~CyberelfJ@ip565f6f48.direct-adsl.nl] has quit [Ping timeout: 246 seconds] 15:15 -!- justanotheruser [~Justan@unaffiliated/justanotheruser] has quit [Ping timeout: 244 seconds] 15:16 < kanzure> "exit rights" 15:17 -!- FourFire [~fourfire@2.148.239.178.tmi.telenormobil.no] has joined ##hplusroadmap 15:21 -!- Houshalter [~Houshalte@oh-71-50-58-200.dhcp.embarqhsd.net] has quit [Quit: Quit] 15:26 < delinquentme> kanzure, NROOO 15:28 < kanzure> acronym not found, please insert girder 15:29 -!- Houshalter [~Houshalte@oh-71-50-58-200.dhcp.embarqhsd.net] has joined ##hplusroadmap 15:36 -!- FourFire [~fourfire@2.148.239.178.tmi.telenormobil.no] has quit [Quit: Leaving] 15:51 < kanzure> someone should deploy pdfparanoia as a proxy to dynamically modify pdfs in flight 15:51 < kanzure> also, it would be interesting to show a proof-of-concept of dynamically rewriting science papers on the fly, like removing certain references, as a demonstration that papers need to be hashed or signed and submitted to a public repository 16:13 -!- fleshtheworld [~fleshthew@2602:306:cf0f:4c20:6195:4696:c9b:17f3] has joined ##hplusroadmap 16:14 < delinquentme> kanzure, if we really wanted to fuck w the world we could edit the versions of PDFs that are uploaded to libgen 16:14 < kanzure> i don't think they are doing uploads anymore :-/ 16:14 < kanzure> gradstudentbot: http://cenonion.blogspot.com/2015/08/acs-official-entire-meeting-schedule.html 16:14 < gradstudentbot> That paper is clearly bullshit. 16:15 < kanzure> .title 16:15 < yoleaux> C&EN Onion: ACS Official: Entire Meeting Schedule Set To Inconvenience Single Graduate Student 16:22 < kanzure> https://www.reddit.com/r/DarkNetMarkets/comments/3dfq8s/dark_net_market_archives_20112015/ 16:23 < kanzure> "I am releasing all my DNM scrapes: a 50GB (~1.6TB) collection covering 89 DNMs & 37+ related forums, representing <4,438 mirrors." 16:25 < gradstudentbot> I don't know what to tell you, I thought I would have graduated by now. 16:28 < kanzure> https://archive.org/download/dnmarchives 16:33 < c0rw1n> want the magnet link? i'm seeding that 17:01 < maaku> mgin: Yudkowskian "AI safety" is a total non-issue 17:01 < maaku> EY worries about AGI because he thinks any AGI will be like AIXI 17:01 < maaku> whereas no human-scale AGI being worked on resembles AIXI in any meaningful way 17:01 < maaku> the concerns EY has make absolutely no sense in the context of real AI programs that are likely to be written 17:01 < maaku> I worry much, much more over Adwords selling booze to alcoholics than I do AIXI destroying humanity 17:03 < Houshalter> maaku, EY's worries have nothing to do with AIXI. They apply to AI in general (also current state of the art AI is a lot like AIXI) 17:04 < maaku> Houshalter: no, they don't. 
not all AIs are reinforcement learners with undirected search 17:04 < maaku> which is basically what his arguments rely on 17:05 < maaku> and you'll need citations that the current state of the art in AI resembles AIXI -- LIDA, SOAR, CogPrime? 17:05 < Houshalter> maaku, anything that can work in unrestricted open environments, without heavy guidance by the programmers, requires reinforcement learning 17:05 < kanzure> er, do people really call cogprime "state of the art"? 17:06 < maaku> kanzure: I said CogPrime not OpenCog ;) 17:06 < kanzure> but.... hm. 17:06 < maaku> it does represent the state of the art in a certain category of AGI architectures though 17:06 < gradstudentbot> Where did you put the revisions to the paper? 17:07 < kanzure> maaku: have you had a chance to internalize my ai notes? 17:08 < kanzure> maaku: i was actually going to do an onion routing implementation for lightning network stuff this weekend, but unfortunately i'm still upset about the privacy tradeoffs between network connection graph privacy vs transaction privacy. i can't figure out how to preserve both forms of privacy in a lightning routing strategy. 17:08 < maaku> kanzure: with the block size bullshit? no I've had zero time for anything outside of bitcoin. the above discussion is me trying to escape responsibility on a saturday ;) 17:08 < maaku> i'm also closing on a condo on monday, so that's kept me busy 17:08 < maaku> it's on the top of my AI list though 17:08 < kanzure> it is important that no particular routes are revealed because then someone can just query for all the routes amongst the physical network nodes 17:08 < Houshalter> maaku, i'm more referring to deep learning like deepminds atari playing robot. I have zero worry about SOAR or LIDA becoming AGI 17:09 < Houshalter> but this is really irrelevant. those would be just as dangerous. They don't incorporate human values at all, or solve any of the problems with friendly AI 17:10 < maaku> Houshalter: reinforcement learning plays at least a small part in just about every AGI design, but it is not always central, nor are all AGI architectures explicitly goal driven 17:10 < kanzure> human values aren't that great anyway 17:10 < Houshalter> kanzure, according to whose values? 17:10 < maaku> this human's 17:11 < maaku> Houshalter: then I guess that's where we differ -- I have zero worry about deep learning reinforcement learners doing anything non-trivial 17:11 < Houshalter> maaku, reinforcement learning might be a small part. but it's the part that decides what actions to take and what values to follow, so it's the important part for AI risk 17:11 < gradstudentbot> Did you order the carbon nanotubes yet? 17:11 < gradstudentbot> Hey, that could be your research project. 17:11 < kanzure> "but you have a conflict of interest!!!" 17:12 < maaku> kanzure: oh gawd not that :P 17:12 < kanzure> check your inbox, haha 17:13 < kanzure> any thoughts about preserving lightning network connection graph privacy? 17:13 < Houshalter> maaku, they are already doing tons of non-trivial stuff. Deep learning has broken through a ton of different hard AI domains. from vision to natural language to video games and board games 17:13 < kanzure> before i waste time on a shit implementation or shit design 17:14 < Houshalter> maaku, but let's talk about AI in general. If an AI isn't goal driven, then there's nothing to worry about. But it also severely limits what it can do. Also what's to stop someone from adding goal driven behavior to it?
17:14 < maaku> kanzure: you should pick rusty's brain on that. i know he's whiteboarded some onion-like pathfinding, but I don't think that's at a code stage yet 17:14 < gradstudentbot> We were out of the right dye, so I just used an equivalent. 17:16 < kanzure> maaku: onion pathfinding would require a central directory of nodes 17:16 < kanzure> maaku: i have talked with rusty and the conversation didn't get anywhere. 17:16 < maaku> kanzure: yeah i missed your messages above. those sound like the same concerns I heard rusty musing over. I don't think it's been reasonably solved, but you should ask on #lightning-dev 17:16 < kanzure> that's where i was talking with rusty. 17:17 < maaku> kanzure: ok sorry - i'll respond over there 17:17 < kanzure> his response was something like "well if you are a lightning network hub then you want the traffic anyway, so it's okay for you to be known by everyone attacking the network" 17:18 < kanzure> i wish rusty would keep his irc client online, heh 17:23 < maaku> yeah get a bouncer already 17:24 < maaku> Houshalter: in many AGI designs it is not the reinforcement learner that makes the decision over what actions to select, or which to take, which was my point! 17:25 < maaku> deep learning has not to my knowledge accomplished anything of significance in the AGI field however. 17:25 < maaku> but maybe you don't subscribe to the notion that AGI != scaled up narrow AI 17:30 < Houshalter> maaku, if you have an AI with goal directed behavior, and it's goals aren't carefully aligned with our own, then it is very likely to be dangerous. That's the TL;DR of AI risk. It's not partial to what AI architecture you are using. 17:31 < maaku> Houshalter: and most reasonable AGI designs are not inherently strongly goal driven. just like people aren't 17:31 < maaku> Houshalter: there's a general result that any entity in a general environment that works to achieve goals, and gets better at achieving those goals, implements reinforcement learning 17:31 < kanzure> maaku: for molecular nanotechnology things, you should keep in mind that we can use custom proteins to selectively bind to already-existing chunks of proteins. there can be very highly constrained binding domains so that cubic proteins only connect to other cubic proteins in very specific ways. the design costs of making custom binding sites like this are sorta outrageous, but you could technically make relatively precise molecular ... 17:31 < kanzure> ... nanostructures this way. 17:32 < kanzure> *are only currently outrageous 17:32 < kanzure> also you might be able to convince a materials person to figure out a way to cast diamondoids from those scaffolds but whok nows 17:32 < kanzure> *who knows 17:33 < Houshalter> maaku, as for deep learning, what AGI benchmarks are there? Language understanding, related to the Turing test, is the only I know of. And they are at least making significant progress in it. 17:33 < maaku> kanzure: right, that's actually what got me interested in synthetic biology. a talk from singularity university showing attaching protiens to a 2-dimensional scafford so that enzymes would stay close to each other 17:34 < maaku> could be very useful 17:34 < kanzure> that was probably biotin or streptavidin 17:34 < maaku> Houshalter: you can't make a benchmark for AGI, sadly 17:36 < Houshalter> maaku, if it's not goal driven, I almost want to say it's not AGI. AI is all about agents 17:36 < maaku> Houshalter: I could name a dozen tasks that an AGI should be able to do. 
there was a workshop in 2006 i believe in identifying such tasks 17:37 < maaku> but as soon as you name it as a benchmark, a narrow AI task can succeed better at just that problem alone 17:37 < andytoshi> it should be able to define an AGI ;) 17:37 < Houshalter> maaku, for what it's worth, I have something written on why language understanding should be used as a benchmark for AGI. http://houshalter.tumblr.com/post/126023479340/cloze-deletion-test-as-a-measure-of-ai 17:37 < kanzure> i'm pretty sure that i don't understand language 17:38 < Houshalter> but yes, the AI effect is strong. "once a computer can do it, it's not intelligence" 17:38 < kanzure> once a human can do it, it's not intelligence either 17:39 < kanzure> intelligence is worthless anyway; show me an extremely superstitous ai 17:39 < Houshalter> kanzure, those seem to be coherent sentences relevant to the context. what standard are you measuring your language understanding with? 17:39 < maaku> andytoshi: for what it's worth this is a variant of the "PoW for which general purpose computers are the ideal hardware" which you wrote a paper refuting ;) 17:39 < maaku> Houshalter: great. more benchmarks == good. but I guarantee you there is a design in narrow AI space that will outperform any general purpose AI 17:40 < andytoshi> maaku: that's a cool analogy 17:41 < maaku> Houshalter: the AI effect is not what I'm talking about. what I mean is that for any task you can create a special purpose machine which does better at just that task than a general purpose machine 17:41 < andytoshi> the EY argument is that a powerful enough general AI will just create a specialized AI if that's the most efficient thing to do 17:41 < andytoshi> and it'd create it better than we would 17:41 < Houshalter> maaku, AGI will always outperform narrow AI, if they are both optimized towards the same task. Deep blue can compute lots of moves more than a human, but a humans have been able to guide the search of chess engines and do even better 17:41 < maaku> Houshalter: the closest thing I've come to a benchmark is some sort of "does well on ALL the benchmarks, and has a significant Occam weighting" 17:42 < Houshalter> but chess is too limited to begin with. In more open domains AGIs will have bigger and bigger advantages 17:44 < maaku> kanzure: superstitious AGI should arise out of the sort of concept-formation work being done kaj 17:44 < maaku> that's of course not a goal, but something I would expect 17:44 < Houshalter> I think the Turing test can't be beat by narrow AI. maybe limited variants of it. Like that one were some nonexperts talked to a chatbot for 5 minutes, and it pretended it oculdn't speak english well and was just a little kid. That's not even close to what Turing imagined. Turing asked his machines to write sonnets and play chess 17:45 < maaku> you avoid that failure mode by having a drive for accurate and tested models of the world (drive in the PSI sense) 17:45 -!- PatrickRobotham [uid18270@gateway/web/irccloud.com/x-lncvmuoprxflghro] has joined ##hplusroadmap 17:46 < maaku> Houshalter: devil's in the details. define the rules for the test. what's allowed, what's not? 
do that and a narrow AI can win 17:46 < maaku> but if the rules are "there are no rules", then general AI is necessary 17:46 < maaku> BUT it also becomes useless as a benchmark since each result was subjective and/or conditional on circumstances 17:47 < Houshalter> maaku, the rules are, a few experts talk to the AI and a human for as long as they want, and then predict who is the human. There is little wiggle room. No human can talk to a chatbot for as long as they want and not find it's limits. 17:48 < Houshalter> maaku, as for benchmarks, my cloze deletion test or the hutter prize do well to test language understanding in the interim. 17:50 < maaku> Houshalter: btw i don't want to discourage any work on benchmarks. I am looking at your proposal. I think benchmarks are *very* interesting for getting ground truth, but not for comparing programs or optimizing against 17:51 < kanzure> maaku: did i share my nootropics idea. 17:51 < Houshalter> I think the whole point is to have something to optimize against. 17:51 < maaku> E.g. I'm partial to video game playing as test, like the infinite mario world benchmark. Which is a perfect case in point -- the optimal algorithm here turned out to be ... A* search. Implemented by any first year student 17:52 < kanzure> bonobos seem to do okay with pacman 17:52 < maaku> But if you do your best ot implement a general AI, it's interesting to stick it in the mario world context and see how it does. Even if it never beats braindead A* search. 17:52 < maaku> kanzure: bonobos are awesome :) 17:52 < maaku> kanzure: i think so 17:52 < maaku> nootropics scare me though 17:52 < Houshalter> I don't like the defeatist attitude for the hardcore AGI people. "Oh stupid brute force search beat chess. guess there isn't anything brute force methods can't do." It's jsut silly. There are a massive amount of tasks that AI sucks at right now, because simple methods don't work very well, or can't even begin to attack the problem. 17:52 < kanzure> https://www.youtube.com/watch?v=Rh8gfIcjQNY 17:55 < maaku> Houshalter: you won't find argument from me. I pretty much agree point by point. But I'm of the synergistic / integrative school, like Goertzel, and it's not the majority opinion 17:56 < kanzure> "ah yes, i'm of the same opinion of that dude that wrote wargasm" (i'm kidding.. but wtf.) 17:56 < maaku> I should read that 17:56 < kanzure> if you say so 17:56 < maaku> heh 17:56 < maaku> I'd at least understand what you're talking about then 17:56 < maaku> I think 90% of the work is the various narrow AI components, but the 10% doing integrative work is important. 17:56 < kanzure> it's like if you gave a schizophrenic a whole helping of lsd 17:57 < maaku> Which is to say the secret sauce is there, not in the narrow AI parts. 17:57 < maaku> kanzure: huh 17:57 < kanzure> wargasm, i mean 17:57 < kanzure> i think he was just on some acid trip, who knows 17:58 < maaku> i sometimes wonder 17:58 < maaku> goertzel speaking reminds me of ozzy ozborne 17:58 < kanzure> oh there's no question about it 17:58 < kanzure> he openly talks about his drug use, it's no big deal 17:58 < kanzure> and also not a problem, although wargasm is just... wtf. 17:59 < kanzure> also i guess his aggression against me was unfortunately the cause of me giving up on him 17:59 < kanzure> but that's something else 17:59 < kanzure> (various legal threats were involved from him to me; yadda yadda) 18:00 < maaku> seriously? what over? 
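For reference, the cloze-deletion benchmark Houshalter links above amounts to blanking out words from running text and scoring a predictor on recovering them. A minimal, hypothetical sketch in Python; the tokenization, item selection, and baseline predictor are placeholders, not the actual proposal:

```python
# Sketch of a cloze-deletion style test as discussed above: blank out
# tokens and score a predictor on how often it recovers them exactly.
# `predict` is a stand-in for any language model; the baseline below
# is deliberately trivial.
import random

def make_cloze_items(text, n_items=5, seed=0):
    """Blank out one word per item; return (masked_tokens, index, answer)."""
    rng = random.Random(seed)
    tokens = text.split()
    items = []
    for _ in range(n_items):
        i = rng.randrange(len(tokens))
        items.append((tokens[:i] + ["____"] + tokens[i + 1:], i, tokens[i]))
    return items

def score(predict, items):
    """Fraction of blanks the predictor fills in exactly."""
    return sum(predict(masked, i) == answer for masked, i, answer in items) / len(items)

if __name__ == "__main__":
    text = "the quick brown fox jumps over the lazy dog"
    items = make_cloze_items(text)
    baseline = lambda masked, i: "the"   # always guess the most common token
    print("baseline accuracy:", score(baseline, items))
```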
18:01 < kanzure> i was doing some volunteer work for him but had to call it quits, he was upset about this 18:01 -!- c0rw1n is now known as c0rw|zZz 18:16 < maaku> he sued you for quitting? wow. 18:16 < maaku> good thing I don't have any idols 18:17 < maaku> (Except Elon Musk -- that's a god, not a man!) 18:18 < maaku> his deconstruction of WebMind is wonderfully full of every single one of the things not to do when starting a company 18:25 -!- erasmus [~esb@unaffiliated/erasmus] has joined ##hplusroadmap 18:33 -!- delinquentme [~delinquen@74.61.157.78] has quit [Ping timeout: 245 seconds] 18:38 < kanzure> he didn't sue me 18:51 -!- JayDugger1 [~jwdugger@108.19.186.58] has joined ##hplusroadmap 18:52 < AmbulatoryCortex> Musk's personal life has suffered pretty bad under his ambition 18:53 -!- JayDugger [~jwdugger@108.19.186.58] has quit [Ping timeout: 250 seconds] 18:54 < mgin> what's a personal life? 18:54 < AmbulatoryCortex> mgin, In Elon Musk's case, that's exactly the problem 18:55 < mgin> why is that a problem? i don't even know what it is 18:56 < AmbulatoryCortex> seriously? 18:56 < mgin> well i'm just asking what goes into that? 18:56 < AmbulatoryCortex> well, his family 18:56 < AmbulatoryCortex> and his divorces 18:57 < kanzure> getting out of a bad relationship can be healthy, even if it requires divorce 18:57 < AmbulatoryCortex> given how he responded afterward, it wasn't a bad relationship 18:58 < mgin> i pretty never talk to my family 18:58 < mgin> and have no relationship :( 18:58 < mgin> pretty much* 18:59 < AmbulatoryCortex> He's a workaholic genius with grand ambitions, and he'll almost certainly be remembered fondly. And he's paying a high price for that. 19:00 < AmbulatoryCortex> mgin, that's unfortunate 19:00 < AmbulatoryCortex> I have a wide support structure in the form of my and my wife's extended families. 19:00 < mgin> what do you talk about? 
19:01 < AmbulatoryCortex> well, with my dad, I usually talk about my job, since we're in the same profession 19:01 < AmbulatoryCortex> or rockets, aircraft 19:01 < AmbulatoryCortex> vehicles 19:01 < AmbulatoryCortex> dogs 19:01 < AmbulatoryCortex> whatever happens to be interesting at the time 19:01 < AmbulatoryCortex> paint 19:02 < AmbulatoryCortex> (tough industrial paint that you can put on a truck to make it resistant to driving through brush) 19:03 < AmbulatoryCortex> mom is usually social stuff, or my kids 19:03 < AmbulatoryCortex> brothers are games, computing, or engineering 19:03 < AmbulatoryCortex> stuff like that 19:04 < kanzure> yawn 19:04 < AmbulatoryCortex> other times I'll just fall asleep in a chair at their place after a sunday lunch :P 19:06 < AmbulatoryCortex> hey, describing general conversation isn't particularly enthralling, but he asked 19:07 < mgin> huh 19:16 < fenn> i think i'd rather go to mars 19:24 < mgin> well i'm not trying to make a value judgment 19:24 < mgin> it's just interesting the way people spend their time 19:42 -!- Merovoth [~Merovoth@unaffiliated/merovoth] has quit [] 19:43 -!- c0rw|zZz [~c0rw1n@91.176.86.226] has quit [Ping timeout: 246 seconds] 19:53 -!- erasmus [~esb@unaffiliated/erasmus] has quit [Read error: Connection reset by peer] 20:14 -!- Houshalter [~Houshalte@oh-71-50-58-200.dhcp.embarqhsd.net] has quit [Quit: Quit] 20:15 -!- wrldpc1 [~ben@hccd37dda2b.bai.ne.jp] has joined ##hplusroadmap 20:18 -!- erasmus [~esb@unaffiliated/erasmus] has joined ##hplusroadmap 20:51 < kanzure> various mechanical linkages https://www.youtube.com/user/thang010146/videos 20:54 -!- Mokstar [~Moktar@2601:602:8b00:f420:68c3:a51a:2335:8da8] has joined ##hplusroadmap 20:54 -!- Mokstar [~Moktar@2601:602:8b00:f420:68c3:a51a:2335:8da8] has quit [Changing host] 20:54 -!- Mokstar [~Moktar@unaffiliated/mokstar] has joined ##hplusroadmap 20:54 -!- Mokstar [~Moktar@unaffiliated/mokstar] has left ##hplusroadmap [] 20:55 < kanzure> also there seems to be a forum http://meslab.org/mes/threads/20977-Co-cau-con-truot-banh-rang-lech-tam 20:58 < gradstudentbot> I am busy researching. 21:00 < kanzure> .title https://www.youtube.com/watch?v=g8HKd938yp0 21:00 < yoleaux> Keeping direction unchanged during rotation 9a - YouTube 21:03 < kanzure> .title https://www.youtube.com/watch?v=0dJC8lqa8K0 21:03 < yoleaux> Converting two way linear motion into one way rotation 1 - YouTube 21:05 -!- wrldpc1 [~ben@hccd37dda2b.bai.ne.jp] has quit [Quit: wrldpc1] 21:05 < kanzure> http://www.mediafire.com/download/gqt6wxyoq8wstjw/1700AMMe.z­ip 21:05 < kanzure> er.. 21:05 < kanzure> http://www.mediafire.com/download/gqt6wxyoq8wstjw/1700AMMe.zip 21:06 < kanzure> spam him at thang010146@gmail.com 21:10 -!- drewbot [~cinch@ec2-54-90-92-35.compute-1.amazonaws.com] has quit [Remote host closed the connection] 21:11 -!- drewbot [~cinch@ec2-54-144-70-54.compute-1.amazonaws.com] has joined ##hplusroadmap 21:27 -!- PatrickRobotham [uid18270@gateway/web/irccloud.com/x-lncvmuoprxflghro] has quit [Quit: Connection closed for inactivity] 21:34 -!- CaptHindsight [~2020@unaffiliated/capthindsight] has joined ##hplusroadmap 21:43 -!- CaptHindsight [~2020@unaffiliated/capthindsight] has quit [Quit: gone] 21:53 -!- PatrickRobotham [uid18270@gateway/web/irccloud.com/x-rdzaamtfiqccgeoj] has joined ##hplusroadmap 22:06 -!- wrldpc1 [~ben@hccd37dda2b.bai.ne.jp] has joined ##hplusroadmap 22:21 -!- Viper168 [~Viper@unaffiliated/viper168] has quit [Ping timeout: 250 seconds] 22:41 < JayDugger1> Good morning, everyone. 
22:42 < kanzure> night 22:43 < JayDugger1> Fair enough. 22:44 < JayDugger1> I think I'll leave Goertzel's Wargasm off my to-read list. Burroughs-meets-Egan fanfic doesn't rank too high. 22:48 < kanzure> what is greg egan about it? 22:48 -!- jrayhawk [~jrayhawk@unaffiliated/jrayhawk] has quit [Read error: Connection reset by peer] 22:48 < JayDugger1> The mass downloading. 22:49 < JayDugger1> The reality collapse too, (Quarantine, Permutation City). 22:53 -!- AmbulatoryCortex [~Ambulator@173-31-155-69.client.mchsi.com] has quit [Read error: Connection reset by peer] 22:54 -!- yashgaroth [~ffffff@2602:306:35fa:d500:f5e0:f867:a11d:8d52] has quit [Quit: Leaving] 23:07 -!- Viper168 [~Viper@unaffiliated/viper168] has joined ##hplusroadmap --- Log closed Sun Aug 16 00:00:42 2015