--- Day changed Fri May 16 2008
00:16 -!- nsh [n=nsh@87-94-155-173.tampere.customers.dnainternet.fi] has quit [Remote closed the connection]
00:40 < fenn> the reason biotech is expensive is that every last thing is patented multiple times
00:41 < fenn> the only reason it hasn't ground to a halt is that nobody can possibly look at all the patents and figure out who is in violation
00:41 < kanzure> fenn: I just got done reading some of Eli's writings on yudkowsky.net ... why does he get funding?
00:42 < fenn> because nobody else writes seriously about AI ethics
00:43 < kanzure> but it's not a matter of ai
00:43 < kanzure> look, even if you get ai on a box
00:43 < kanzure> it's going to come across the same problems we're fighting with skdb
00:43 < fenn> hmm?
00:43 < kanzure> frankly *we're* ai, just not able to modify our software as quickly of course
00:43 < kanzure> so think about it
00:43 < fenn> no, because we arent able to re-engineer our cognitive architecture
00:43 < kanzure> if you have ai running on a computer, it designs its next generation hardware
00:43 < kanzure> assume it's reached upper limits on software
00:43 < kanzure> so it needs a new ISA
00:44 < kanzure> so it goes to make it, great, then it needs a physical implementation
00:44 < kanzure> how is it going to figure that out? the same way we would
00:44 < fenn> yes
00:44 < fenn> so? that's just the first generation
00:44 < kanzure> ?
00:44 < kanzure> I'm not saying ai is a bad idea
00:45 < fenn> it's hard to say much of anything.. one ai may be wet and mushy, whereas another might be formalized and brittle
00:45 < kanzure> just that it's going to run into the exact same problems
00:45 < kanzure> it'll need new ways of interfacing with physical reality, just like skdb
00:45 < fenn> but they both run on the same silicon
00:45 < fenn> no, it wont run into the exact same problems, that's stupid
00:45 < kanzure> it'll need arms and manipulators and manufacturing equipment, or at least a text display so that it can tell a human what to do
00:45 < kanzure> okay, so then what is it going to do
00:45 < kanzure> just sit there, compute,
00:46 < kanzure> and suddenly intuit a new architecture out of the divine void
00:46 < kanzure> and suddenly it's magically implemented?
00:46 < fenn> maybe
00:46 < kanzure> wtf?
00:46 < fenn> well, say it reads a few papers and does this DNA FPGA thingie i'm rambling about, and sends them off to get sequenced
00:46 < kanzure> wait, sequenced?
00:46 < fenn> total human interaction: zero so far
00:47 < fenn> un-sequenced
00:47 < kanzure> right, okay
00:47 < kanzure> okay, so you're thinking more about a DNA-FPGA ai there
00:47 < kanzure> and that's implemented on bio
00:47 < kanzure> which is already an exponential process
00:47 < fenn> so then it just has to convince some human to shake up a couple test tubes of stuff and pour it into a beaker with wires dangling in
00:47 < kanzure> so that's the exponential growth hijack scenario
00:48 < fenn> that's just cognitive architecture number 2
00:48 < fenn> well, computational architecture really
00:48 < kanzure> yeah, but that's the hijacking scenario, what about the bootstrap scenario
00:48 < fenn> but i assume the AI is smart enough to take full advantage of FPGA's (whereas humans suck at this)
00:48 < kanzure> the hijack scenario is iffy, it's like GNU Hurd except on steroids - CSAIL has been beating their head against the amorphous computation problem for decades
00:49 < fenn> why do you call it a hijack scenario?
00:49 < kanzure> well, mostly because you used that terminology before
00:49 < kanzure> you were asking about what possible exponential processes there are that we could hijack
00:50 < kanzure> but I think I've probably used it before too
00:50 < fenn> er.. i dont remember using that word
00:50 < fenn> anyway it's clear now what you mean
00:50 < kanzure> anyway, it should be obvious
00:50 < kanzure> yeah
00:51 < fenn> i thought you meant the ai was hijacking a human's actuators to do its bidding
00:51 < kanzure> fun stuff
00:51 < fenn> all ur actuator r belong to us
00:55 < kanzure> on this subject
00:55 < kanzure> http://heybryan.org/pipermail/hplusroadmap/2008-May/000493.html
00:55 < kanzure> that's the Anissimov email, or at least my response to it
01:00 < kanzure> beep
01:00 < kanzure> you stopped.
01:02 < fenn> good exponential intelligence, bad exponential intelligence
01:03 < fenn> if you outlaw microrobots, only criminals will have microrobots
01:03 < kanzure> eh?
01:03 < fenn> > Society goes to hell when I give myself a technology that lets me
01:03 < fenn> > kill hundreds of people undetected (microrobotics, for instance),
01:03 < fenn> > then millions of other people get it, then they all use it. All it
01:03 < fenn> > takes is 1/1000 people to be murderous for this to be a problem.
01:03 < fenn> the proactionary principle turned out to be not what i expected
01:03 < kanzure> what did you think it was?
01:04 < fenn> i was expecting, "instead of worrying about bad possibilities, lets implement safety nets now, before anything happens"
01:04 < kanzure> it's sort of like that - but if anything, the safety nets you do implement, it's more about *your* safety nets, not Societal Blankets
01:04 < fenn> the engineering approach to disaster prevention
01:05 < fenn> i'd feel much more secure having an anti-microbot shield than some wimpy legislation banning microbots
01:06 < kanzure> "The Proactionary Principle recognizes that nature is not always kind, that improving our world is both natural and essential for humanity, and that stagnation is not a realistic or worthy option."
01:06 < kanzure> fenn: re: shield, http://lifeboat.com/ but they need to get their act together
01:06 < kanzure> or we could just fork a spacepod-colony-thing
01:06 < fenn> lifeboat suffers from internet-itis
01:06 < kanzure> hehe
01:06 < fenn> same reason luf failed
01:06 < fenn> the millenial project
01:06 < kanzure> I dunno, I have to wonder how the hell Eric Hunting could fail
01:06 < kanzure> I mean, he seems kind of like me, especially in his lengthy emails
01:07 < fenn> he writes much more coherently
01:07 < kanzure> yes
01:07 < kanzure> maybe he's been over the arguments often?
01:07 < fenn> i dont think so
01:07 < fenn> i've google-stalked him and he mostly writes to the luf-team list
01:07 < kanzure> the proactionary principle - "Let a thousand flowers bloom! By all means, inspect the flowers for signs of infestation and weed as necessary. But don't cut off the hands of those who spread the seeds of the future."
01:08 < fenn> and his old shelter webpage which is now stagnant
01:08 < kanzure> aka. don't shoot yourself in the foot, you retards
01:09 < kanzure> or more likely - in the head - that's a better modern interpretation
01:10 < kanzure> but really, I don't know if you're following my "ai will not bring about the singularity" line of reasoning or not
01:10 < fenn> heh, you get around http://article.gmane.org/gmane.linux.debian.apt.devel/14488
01:10 < kanzure> do you know how much of a pain in the ass it is to googlestalk myself?
01:10 < kanzure> :(
01:11 < fenn> that was from googling for "eric hunting"
01:12 < kanzure> haha
01:16 < fenn> "People will do the research anyway, just like they code software anyway." unfortunately its much more likely that people get paid to engineer super-viruses rather than doing it for fun
01:17 < fenn> but it will go in some military black budget account vs being out in the open where we can see it (if you cut research spending)
01:17 < kanzure> yikes :(
01:18 < kanzure> there's a part in the email - a quote - that mentions "illegal cures"
01:22 -!- kanzure [n=bryan@cpe-70-113-54-112.austin.res.rr.com] has quit ["Leaving."]
03:35 -!- ybit [n=u1@unaffiliated/ybit] has quit [Read error: 110 (Connection timed out)]
17:36 -!- fenn [n=pz@adsl-99-133-185-36.dsl.bltnin.sbcglobal.net] has joined #hplusroadmap
17:36 -!- Topic for #hplusroadmap: http://heybryan.org/ http://heybryan.org/mediawiki/ http://heybryan.org/exp.html | krebs is now servicing the channel. try !help
17:36 -!- Topic set by kanzure [] [Tue Apr 29 18:54:31 2008]
17:36 [Users #hplusroadmap]
17:36 [ fenn] [ krebs] [ nsh] [ Phreedom] [ ybit]
17:36 -!- Irssi: #hplusroadmap: Total of 5 nicks [0 ops, 0 halfops, 0 voices, 5 normal]
17:36 -!- [freenode-info] please register your nickname...don't forget to auto-identify! http://freenode.net/faq.shtml#nicksetup
17:36 -!- Channel #hplusroadmap created Sat Mar 22 15:44:12 2008
17:37 -!- Irssi: Join to #hplusroadmap was synced in 33 secs
17:44 -!- kanzure [n=bryan@cpe-70-113-54-112.austin.res.rr.com] has joined #hplusroadmap
17:45 < kanzure> Hey.
17:55 < fenn> hi.
17:56 < fenn> the paper knot model got my roomate's 8-yr-old daughter interested in blender
17:56 < kanzure> uh, to what extent
17:56 < kanzure> I would *not* recommend blender for 8 yr olds
17:57 < fenn> heh why not?
17:57 < kanzure> unless they can sit and click for hours and not expect results for a while
17:57 < kanzure> ooh
17:57 < kanzure> screw that
17:57 < kanzure> http://youtube.com/ has videos on blender
17:57 < kanzure> set that kid up with that
17:57 < fenn> she didnt seem interested in me "helping"
17:57 < kanzure> ah, that's good
17:58 < kanzure> http://biobus.org/ - they drive around with equipment in a bus
18:00 < kanzure> if I have to use blender, I think I'll take up the scripting approach
18:13 < kanzure> fenn: have you found any holes in my argument from last night re: ai isn't going to solve the skdb problem?
18:24 < kanzure> gmail is not responsive
18:26 < fenn> yes lots of them, but you arent paying attention
18:27 < fenn> an AI thinks in code, so the whole problem of turning human understanding/knowledge into codified programs doesn't exist
18:29 < fenn> also, since it doesnt have the same type of hardware processing built-in (ex: face recognition) it will come up with different solutions than a human would, but it can still leverage human developments
18:31 < kanzure> "debian science git repository is up and running" hurray (an email I got 12 hours ago apparently)
18:31 < kanzure> sigh
18:31 < kanzure> look, I don't care if it's thinking in code
18:31 < kanzure> that doesn't matter
18:31 < kanzure> the problem would exist even if *we* thought in code
18:32 < kanzure> it still needs to be 'grounded' with instrumentation into physical reality, it still needs to be able to assemble and process the information that really hasn't been assembled yet, and it would need to actually *do* stuff
18:32 < kanzure> I'm not saying it's impossible for an ai to do it
18:32 < kanzure> I'm just saying that I'm questioning the focus on building an ai first, rather than you know
18:32 < kanzure> doing both things at once
18:51 < fenn> the assumption is that the ai will "bootstrap" itself and become vastly more intelligent than us, so that then manufacturing problems become an insignificant background task
18:52 < fenn> i see that lots of smart people have been working on AI for fifty years now and havent exactly achieved their goals
18:52 < fenn> however, nobody's tried what we're doing to my knowledge
18:57 < kanzure> so you think intelligence solves the manufacturing problem?
18:58 < kanzure> as opposed to aggregation of knowledge that we've acquired by wrestling with experiments and modeling etc.
18:58 < kanzure> I mean, I ultimately hope that intelligence can bruteforce its way out of any hell, any shithole -- believe me, I spend many hours each day in school thinking about this
18:59 < fenn> no, i think the intelligence will undoubtedly find a way to do real empirical experiments in a rapid manner, without having to waste its time writing grant proposals and journal articles
19:00 < kanzure> sure, okay
19:00 < fenn> its not just going to sit in a box and think
19:00 < kanzure> so it needs to be interfaced with the outside world
19:00 < kanzure> kind of like with a manufacturing/peripheral system
19:00 < kanzure> oh wait
19:00 < kanzure> :)
19:00 < fenn> but that doesnt mean you cant accomplish anything at all inside a box
19:00 < kanzure> right, of course
19:00 < kanzure> lots of good models can be made and so on
19:01 < fenn> not models, more like aggregation and formalization of data
19:01 < kanzure> I've found that I've been able to do some good predictive modeling of sorts ... not "here's the situation, now predict" but rather constructing ideas that I later find applicable
19:01 < kanzure> instead of just-in-time learning.
19:01 < fenn> sorting and assimilating information
19:01 < fenn> sound familiar :)
19:02 < kanzure> oh
19:02 < kanzure> it doesn't really matter anyway, the "ai bootstrapping is wrong" idea doesn't matter
19:02 < fenn> hmm
19:02 < kanzure> oh, no, nevermind ... it's to steal Eli's funding ;-)
19:02 < kanzure> I forgot.
19:02 < kanzure> future archivists: I'm half joking.
19:03 < fenn> you know he moved to california from illinois in order to get funding..
19:03 < fenn> i think that's a large part of it
19:03 < fenn> dear google:
19:03 < fenn> i know that you're young and immature, but i really think you should cut it out with the spelling substitutions
19:03 < fenn> thanks,
19:03 < fenn> -one of your many fleshy carbon-based subjects
19:04 < kanzure> fenn: He was on the extropy-chat mailing list starting in 1999 and then got funding sometime four years later or something
19:04 < kanzure> kind of peculiar
19:04 < kanzure> I think he was doing some writing that got him noticed, I don't know
19:04 < fenn> well, its quite interesting, have you read any of his papers?
19:05 < kanzure> oh, a few
19:05 < kanzure> A Technical Explanation of Technical Explanation
19:05 < kanzure> his intelligence-book
19:05 < kanzure> one on staring into the singularity
19:05 < kanzure> (which talks about big giant numbers and exponential processes ... from the context of ai)
19:06 < fenn> well, the interesting and unique ones are his writing about friendliness
19:07 < kanzure> ah yes, FAI arguments
19:47 < kanzure> holy shit
19:47 < kanzure> Myanmar's cyclone - 100k dead people, 2 million needing some sort of assistance (housing, food, medicine, supplies); - this isn't the "holy shit" part
19:47 < kanzure> international support teams have been deployed, but are stopped at the borders
19:47 < kanzure> the government will not issue visas to let them in
19:48 < kanzure> so how's that for sick
19:48 < kanzure> these people are ready to be deployed
19:48 < kanzure> but they choose not to because "oh, well, we don't have visas"
20:11 < kanzure> fenn: Let's get a little sick.
20:11 < kanzure> In our own way.
20:11 < kanzure> we can do an analysis on how well a clanking replicator could have responded to the disaster
20:12 < fenn> um, no
20:13 < kanzure> as a way to get funding
20:13 < kanzure> within 20 days, at one-day replication cycles, there'd be 1 million units
20:13 < fenn> we already have self-replicating general purpose machines and they were stopped at the border because they didnt have visas
20:14 < fenn> otherwise you get into political discussions
20:14 < kanzure> they don't replicate fast enough, and when they *do* replicate, you need to train them for at least 12 to 15 years before they have a clue
20:14 < kanzure> they weren't "stopped" - they respectfully declined to enter without visas
20:14 < kanzure> the military wasn't showing up with tanks and machinery to gun them down
20:14 < fenn> oh they werent? well thats stupid
20:14 < fenn> i mean, more stupid
20:17 < kanzure> fenn: ever read anything by Howard Bloom? Lucifer Principle, or Global Brain, in particular?
20:19 < kanzure> I'm trying to figure out if I should read them right now. I got them from a friend at the WTA.
20:20 < kanzure> I'm pretty sure the Lucifer Principle is kind of like the Nonzero book (re: the nonzerosumness of collaboration throughout evolutionary history)
20:25 < kanzure> erm, I shouldn't bother asking that
20:25 < kanzure> need to get back to work :)
21:23 < kanzure> well, shit
21:23 < kanzure> http://www.openverse.com/~dtinker/agalmics.html
21:23 < kanzure> I was going to go contact Robert Levin (lilo) to talk about that pape
21:23 < kanzure> *paper
21:23 < kanzure> but then I realized he founded freenode and proceeded to die
21:23 < fenn> yep
21:24 < kanzure> I didn't remember all of the updates from lilo I was getting off of freenode a year back from now
21:24 < kanzure> remembr those?
21:24 < kanzure> *remember
21:24 < fenn> yeah he was a good guy, i dont really understand why so many people hated him
21:24 < kanzure> people hated him?
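kanzure's 20:13 figure is plain doubling arithmetic: one replicator copying itself once per day yields 2^20 ≈ 1 million units after 20 days. A minimal sketch (the function name and parameters are illustrative, not from the log):

```python
def units_after(days: int, start: int = 1, cycle_days: int = 1) -> int:
    """Population of self-replicating units, assuming every existing
    unit produces one copy of itself per replication cycle
    (pure doubling, no failures or resource limits)."""
    return start * 2 ** (days // cycle_days)

if __name__ == "__main__":
    # 20 days at one-day cycles: 2**20 = 1048576, i.e. ~1 million units
    print(units_after(20))
```

The same arithmetic explains why fenn's "we already have self-replicating general purpose machines" quip cuts the other way: humans replicate on ~20-year cycles, not one-day cycles.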
21:24 < fenn> maybe they were just more vocal about it
21:27 < fenn> too bad the word agalmic never caught on
21:28 < kanzure> it's a useful concept, "marginalization of scarcity"
21:28 < kanzure> although importantly, the knee-jerk reaction is to say "but scarcity is still real, blah blah blah - don't marginalize real problems"
21:29 < fenn> um, no it isnt
21:29 -!- nsh [n=nsh@wikipedia/nsh] has quit [Remote closed the connection]
21:29 -!- nsh [n=nsh@87-94-155-173.tampere.customers.dnainternet.fi] has joined #hplusroadmap
21:30 < kanzure> I mean to say that 'marginalization of scarcity' can be taken to mean 'marginalization of the issue of scarcity' (covering it up) (which this essay is not about, yes, but I refer to the use of the phrase (marginalization of scarcity))
21:30 < fenn> on that topic, have you read http://www-formal.stanford.edu/jmc/progress/ (particularly the natural resources part)
21:30 < kanzure> no, please hold
21:31 < fenn> its a huge linky behemoth spanning much (for me) new material
21:31 < fenn> interesting site though, i recommend reading all of it
21:31 < fenn> btw he discovered/invented lisp
21:31 < kanzure> haha, token "E.O. Wilson" reference :)
21:32 < fenn> nice
21:33 < fenn> there's a lot of proactionary philosophy on that site
21:34 < kanzure> he has an odd perspective
21:34 < kanzure> he's trying to write for the masses apparently
21:34 < kanzure> well, not "the masses"
21:34 < kanzure> but a wide audience
21:35 < fenn> i think its supposed to be like a 'getting up to speed' page, somethin you can point to for correcting a lot of widespread misunderstandings and baseless fears
21:36 < kanzure> hm
21:36 < kanzure> if he had skdb available at the time, he could have just created new projects for each of his points and jotted down a few technical notes on implementation and design
21:36 < kanzure> and then just aggregated all of them together to address the issues
21:37 < kanzure> instead of just leaving us with a static html page that doesn't link to actual *solutions*
21:37 < kanzure> (yes, the words describe solutions, sure)
21:37 < fenn> hmm i dont agree with you on that
21:37 < kanzure> how so
21:37 < fenn> if we try to expand skdb to fulfill every type of information storage and conveyance, it'll be good at nothing
21:38 < kanzure> 'information storage and conveyance'?
21:38 < kanzure> you mean, the engineering projects that are mentioned on McCarthy's page?
21:38 < fenn> like, people try to use mediawiki for everything, when it's really designed to be an encyclopedia
21:38 < fenn> not a blog, not a photo gallery
21:39 < kanzure> so why wouldn't people use SKDB for projects
21:39 < fenn> if he had nuclear reactor designs on his site, would that be better?
21:39 < kanzure> probably :)
21:39 < kanzure> and those designs would belong in SKDB, IMO
21:40 < fenn> referencing your sources is great but it's not absolutely necessary to get the point across
21:40 < kanzure> true, but part of the whole point is to provide a way for people to get involved
21:40 < kanzure> that's what "getting up to speed" is about
21:40 < kanzure> unlike the news, where you just hear stuff and nod
21:41 < fenn> and how is skdb supposed to store stuff like "Life expectancy in both the rich and poor countries. Infant mortality in rich and poor countries.
21:41 < fenn> Days lost per year due to illness."?
21:41 < kanzure> no, I mean the engineering projects he mentions would go in there
21:41 < kanzure> life expectancy improvement tech would go in there
21:41 < kanzure> and he's free to link over to the solutions
21:41 < fenn> it sounded like you wanted to put the whole webpage in an skdb project
21:41 < kanzure> well ... documentation?
21:42 < kanzure> doesn't that go within a project?
21:42 < fenn> sure, and it should be wiki-able
21:44 < fenn> i'm notoriously bad at citing references (mostly because i can't remember where i learned something)
21:45 < kanzure> :(
21:47 < fenn> heh remember your "squeezing the most out of google programmers" http://www.flickr.com/photos/10719678%40N08/1424289534/in/photostream/
21:50 < kanzure> heh, squeezing every last drop
22:43 < fenn> this is hilarious, its about bad urban planning and suburbia http://video.google.com/videoplay?docid=-3057280178909051497
23:38 < kanzure> fenn: why doesn't unfold.py work for me?
23:39 < kanzure> it generates SVG, but I don't get anything in the SVG file
23:39 < fenn> it's very small
23:39 < fenn> select all, then zoom to selection
23:41 < kanzure> select all what?
23:41 < fenn> ctrl-a
23:42 < fenn> then click on the magnifying glass with a dashed box in it
23:42 < fenn> or ctrl-drag on the handles to proportionally scale it up
23:43 < kanzure> I'm getting nothing.
23:43 < kanzure> the file is nonempty
23:44 < kanzure> export to png also shows nothing
23:48 < kanzure> hrm, this sucks, I'm going to have to learn blender
23:48  * kanzure goes off to youtube to hear some annoying voices
23:49 < fenn> in object mode, right-click on what you want to unfold, then run mesh->unfold, select 'curvature' and 'search' and click unfold, then save
23:49 < fenn> doncha love GUI
23:50 < fenn> why does LUF attract so many wingnuts
23:53 < kanzure> many groups that use yahoo for mailing lists are generally attractors of wingnuts for some reason
23:53 < kanzure> just in general.
23:53 < kanzure> orions_arm is an interesting exception