--- Log opened Wed Aug 29 00:00:10 2012 00:09 < foucist> delinquentme: 11:19 < nmz787> delinquentme: I would use SCP 00:13 < delinquentme> that was the last activity?? 00:13 < foucist> delinquentme: right after you asked how to send a 2gb file 00:14 < delinquentme> Ohhh 00:16 < foucist> delinquentme: personally i use the alias: scpresume 00:16 < foucist> scpresume='rsync --partial --progress --rsh=ssh' 00:16 < foucist> just in case the file gets interrupted 00:16 < foucist> scp doesn't have resume itself 00:16 < foucist> though wget -c is pretty good too 00:16 < foucist> asuming the file was on a webserver :P 00:17 < foucist> funny how ftp clients already solved the problem a long time ago 00:17 < foucist> but i havne't used ftp in YEARS 00:18 < foucist> well, probably not since i used windows 00:18 -!- joshcryer [g@unaffiliated/joshcryer] has quit [] 00:42 -!- archels [~foo@sascha.esrac.ele.tue.nl] has quit [Ping timeout: 252 seconds] 00:52 -!- archels [~foo@sascha.esrac.ele.tue.nl] has joined ##hplusroadmap 00:59 -!- minimoose [~minimoose@pool-173-75-216-239.phlapa.fios.verizon.net] has quit [Quit: minimoose] 01:07 < delinquentme> NIGHT! 01:07 -!- delinquentme [47ec6527@gateway/web/freenode/ip.71.236.101.39] has quit [Quit: Page closed] 01:13 -!- obscurite [~obscurite@danielpacker.org] has quit [Ping timeout: 276 seconds] 01:26 -!- Netsplit *.net <-> *.split quits: superkuh 01:29 -!- drazak_ [~ahdfadkfa@199.188.72.84] has quit [Ping timeout: 245 seconds] 01:30 -!- obscurit1 [~obscurite@danielpacker.org] has joined ##hplusroadmap 01:54 < archels> http://www.urbandictionary.com/define.php?term=Hacker%20Spaces 01:59 -!- obscurit1 [~obscurite@danielpacker.org] has quit [Ping timeout: 244 seconds] 02:00 -!- obscurite [~obscurite@danielpacker.org] has joined ##hplusroadmap 02:00 < Utopiah> looks accurate 02:01 -!- AdrienG [~ircname@unaffiliated/amphetamine] has joined ##hplusroadmap 02:04 -!- AdrianG [~ircname@unaffiliated/amphetamine] has quit [Ping timeout: 244 seconds] 02:18 -!- obscurite [~obscurite@danielpacker.org] has quit [Ping timeout: 245 seconds] 02:26 -!- obscurit1 [~obscurite@danielpacker.org] has joined ##hplusroadmap 02:30 -!- hankx7787 [181e2e48@gateway/web/freenode/ip.24.30.46.72] has joined ##hplusroadmap 02:32 -!- obscurit1 [~obscurite@danielpacker.org] has quit [Ping timeout: 268 seconds] 02:32 -!- hankx7787 [181e2e48@gateway/web/freenode/ip.24.30.46.72] has quit [Client Quit] 02:34 -!- drazak_ [~ahdfadkfa@199.188.72.84] has joined ##hplusroadmap 02:40 -!- obscurit1 [~obscurite@danielpacker.org] has joined ##hplusroadmap 03:50 -!- sylph_mako [~mako@168.3.252.27.dyn.cust.vf.net.nz] has quit [Ping timeout: 248 seconds] 03:53 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has joined ##hplusroadmap 04:32 -!- superkuh [~superkuh@unaffiliated/superkuh] has joined ##hplusroadmap 04:34 -!- superkuh [~superkuh@unaffiliated/superkuh] has quit [Remote host closed the connection] 04:35 -!- tashoutang [~tata@pc131090206.ntunhs.edu.tw] has quit [Ping timeout: 244 seconds] 04:44 -!- ThomasEgi [~thomas@panda3d/ThomasEgi] has joined ##hplusroadmap 04:48 -!- superkuh [~superkuh@unaffiliated/superkuh] has joined ##hplusroadmap 04:49 < strangewarp> P2P Foundation has a maddive hard-on for Carrico, all of a sudden... 
Looks like their main blogger has been captured by the "social justice before technology" thing 04:49 < strangewarp> massive* 04:49 * strangewarp unfollows that blog 05:04 < strangewarp> I was momentarily tempted to write an essay about how Carrico's opposition of globalization technologies impedes the formation of international workers' solidarity, which means his leftist credentials are short-sighted at best... but then I remembered he's a dang troll 05:41 -!- nmz787 [~Nathan@ool-45792f2b.dyn.optonline.net] has joined ##hplusroadmap 05:46 < nmz787> kanzure: jrayhawk: i have this dir... d-wx-ws--x 5 root staff 4096 Apr 20 01:33 djangoApps 05:46 < nmz787> when i type groups, two things show up, staff and sudo 05:46 < nmz787> so why can't I even ls the dir as my user? 05:48 < chris_99> don't you need read permissions 05:50 < nmz787> derr 05:50 < nmz787> yes 05:50 < nmz787> thx 06:07 -!- chido [chidori@pasky.or.cz] has joined ##hplusroadmap 06:30 -!- minimoose [~minimoose@74-95-191-59-Philadelphia.hfc.comcastbusiness.net] has joined ##hplusroadmap 06:33 < OldCoder> minimoose, Good nick 06:41 < minimoose> tx, OldCoder 06:47 -!- jmil [~jmil@hive76/member/jmil] has quit [Quit: jmil] 07:31 -!- jmil [~jmil@hive76/member/jmil] has joined ##hplusroadmap 08:05 -!- soylentbomb [~k@d149-67-118-140.col.wideopenwest.com] has joined ##hplusroadmap 08:13 -!- ThomasEgi [~thomas@panda3d/ThomasEgi] has quit [Remote host closed the connection] 08:19 < nmz787> anyone here know SVN? 08:29 < bkero> for a certain definition of know 08:30 < bkero> nmz787: chmod g+x djangoApps 08:30 < bkero> s-s-s-sticky bit 08:30 < nmz787> bkero: that was already done 08:31 < nmz787> svn status says the dir is locked tho 08:31 * bkero doesn't deal with svn locks 08:34 -!- minimoose [~minimoose@74-95-191-59-Philadelphia.hfc.comcastbusiness.net] has quit [Quit: minimoose] 08:45 -!- EnLilaSko [~Nattzor@unaffiliated/enlilasko] has joined ##hplusroadmap 08:46 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has quit [Ping timeout: 246 seconds] 08:59 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has joined ##hplusroadmap 09:06 -!- archels [~foo@sascha.esrac.ele.tue.nl] has quit [Ping timeout: 256 seconds] 09:08 <@kanzure> death to svn 09:15 -!- archels [~foo@sascha.esrac.ele.tue.nl] has joined ##hplusroadmap 09:25 <@kanzure> "A hundred billion or so humans have ever lived, but only seven billion are alive now (which gives the human condition a 93% mortality rate)." 
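Re the d-wx-ws--x djangoApps directory from earlier this morning: the group execute bit lets staff cd into it, but with no group read bit an ls will always fail, which is the point chris_99 was making (the g+x bkero mentions is already set, per the listing). A minimal Python check of the mode bits, assuming it is run from the directory's parent:

    import os, stat

    st = os.stat("djangoApps")                  # the directory from the listing above
    print(oct(st.st_mode & 0o7777))             # 02331 / 0o2331: setgid, then -wx / -ws / --x
    print(bool(st.st_mode & stat.S_IRGRP))      # False: group read bit missing, so ls fails
    print(bool(st.st_mode & stat.S_IXGRP))      # True: group can still cd into it

The fix is a group read bit (chmod g+r), since execute is already there.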
09:28 -!- Falfe [~not@c83-251-81-162.bredband.comhem.se] has joined ##hplusroadmap 09:38 -!- chido [chidori@pasky.or.cz] has quit [Read error: Connection reset by peer] 09:42 -!- augur [~augur@208.58.5.87] has quit [Remote host closed the connection] 09:49 -!- chido [chidori@pasky.or.cz] has joined ##hplusroadmap 09:58 -!- pasky [~pasky@nikam.ms.mff.cuni.cz] has quit [Ping timeout: 240 seconds] 09:58 -!- audy [~audy@unaffiliated/audy] has quit [Quit: ZNC - http://znc.sourceforge.net] 09:58 -!- pasky [~pasky@nikam.ms.mff.cuni.cz] has joined ##hplusroadmap 09:58 -!- audy [~audy@heyaudy.com] has joined ##hplusroadmap 10:06 -!- SDr|Berlin is now known as SDr|London 10:07 -!- skorket [~skorket@cpe-24-58-232-122.twcny.res.rr.com] has quit [Quit: Leaving] 10:14 -!- ThomasEgi [~thomas@panda3d/ThomasEgi] has joined ##hplusroadmap 10:16 -!- augur [~augur@216-164-54-41.c3-0.slvr-ubr1.lnh-slvr.md.cable.rcn.com] has joined ##hplusroadmap 10:18 < nmz787> soo ACS Synthetic Biology is now offered in print 10:18 < nmz787> that's stupid 10:19 < nmz787> how am i suppoesd to be a synBio fanboy if i can't get some physical instruments 10:19 < nmz787> but members get to choose 5 titles with their subscription 10:20 < nmz787> so 25 bucks a year for some ACS journals isn't too bad 10:20 < nmz787> but i really wanted print 10:22 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has quit [Read error: Connection timed out] 10:22 -!- augur [~augur@216-164-54-41.c3-0.slvr-ubr1.lnh-slvr.md.cable.rcn.com] has quit [Ping timeout: 268 seconds] 10:23 -!- hifrog [~swamp@p5B16D6F4.dip.t-dialin.net] has joined ##hplusroadmap 10:23 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has joined ##hplusroadmap 10:25 <@kanzure> http://coding.fm/ i like the "angry dev coding" mode (it's just "sounds of angry coders typing on keyboard") 10:25 <@kanzure> they need to throw in some "FUCK"s and "SHIIIT"s 10:25 < chido> I have that live at home 10:28 <@kanzure> the at-home version exhausts your resources of food and drink 10:29 < nmz787> any opinions on http://phabricator.org 10:29 <@kanzure> i'd look at redmine first probably 10:30 <@kanzure> i'm not too hawt on running php 10:30 <@kanzure> a lot of people just use github or unfuddle or something 10:31 -!- EnLilaSko [~Nattzor@unaffiliated/enlilasko] has quit [Ping timeout: 244 seconds] 10:32 -!- EnLilaSko [~Nattzor@unaffiliated/enlilasko] has joined ##hplusroadmap 10:40 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has quit [Ping timeout: 256 seconds] 10:44 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has joined ##hplusroadmap 10:44 < chris_99> if i'm going to get a book on microbiology 10:44 < chris_99> what would you guys recommend 10:45 < chris_99> *molecular biology 10:48 < nmz787> why doesnt this work find / -name svn2|grep -v 'denied' 10:49 < nmz787> mol bio of the gene 10:49 < nmz787> mol bio of the cell is supposed to be good too 10:50 <@kanzure> nmz787: do you mean "find ./" 10:51 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has quit [Ping timeout: 268 seconds] 10:51 < nmz787> no 10:51 <@kanzure> chris_99: the wikibook on microbiology is surprisingly awful http://en.wikibooks.org/wiki/Microbiology 10:51 < nmz787> grep isn't finding denied 10:51 <@kanzure> -v is to exclude 10:51 < nmz787> chris_99 this is better prob http://www.google.com/url?sa=t&rct=j&q=microbio%20book&source=web&cd=13&cad=rja&ved=0CGUQFjAM&url=http%3A%2F%2Ftextbookofbacteriology.net%2F&ei=p1Y-UJ-wFMHW0QGs-oDICw&usg=AFQjCNFNPmWiNaFLBuTdBWxcAJOctFU0lg 10:51 
<@kanzure> and this sounds like you want "locate". your system is probably running locatedb. this would be faster than running find on your whole file system. 10:52 < nmz787> google sucks 10:52 < nmz787> textbookofbacteriology.net/ 10:52 < nmz787> yes i want to exclude 10:52 < nmz787> that is why i give -v 10:52 < nmz787> but it doesnt exclude 'denied' 10:52 < nmz787> it still shows up on STDOUT 10:52 <@kanzure> because of how find works, i think 10:52 < nmz787> :/ 10:52 < nmz787> grr 10:52 <@kanzure> locate svn2 | grep -v denied 10:53 < nmz787> that immediately returns with no results 10:53 < nmz787> i just installed svn2git 10:53 < nmz787> and it cant find it 10:53 < nmz787> and i wanna search for the sv2git file 10:53 <@kanzure> what is svn2git? 10:53 <@kanzure> git-svn is provided by default with git i think 10:53 <@kanzure> type: man git-svn 10:54 <@kanzure> that textbook doesn't look very thorough.. it goes over some metabolism stuff but it doesn't seem to mention streaking, plating, or things like that 10:55 < nmz787> https://github.com/nirvdrum/svn2git 10:55 < nmz787> there is nothing called git-svn 10:55 < nmz787> after installing some stuff git svn works 10:56 < nmz787> with a space 10:56 < nmz787> not dash 10:57 < nmz787> is streaking really microbiology though? 11:00 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has joined ##hplusroadmap 11:00 < jrayhawk> '2|' is not a valid stream redirection 11:01 < jrayhawk> you can use |& as shorthand for 2>&1 | 11:01 < jrayhawk> at least, in bash 11:01 < nmz787> no i want to search for 'svn2' 11:01 < nmz787> i didnt think a space was rewuired 11:01 < nmz787> required 11:01 < jrayhawk> oh, well, use 2>&1 | to get stderr into stdout 11:01 < jrayhawk> It isn't. 11:01 < nmz787> why is find going to put results in stderr? 11:02 < jrayhawk> you seemed to be grepping for 'permission denied', which is, notably, an error. 11:02 <@kanzure> what is the "ncbi C++ toolbox"? http://www.ncbi.nlm.nih.gov/IEB/ToolBox/CPP_DOC/ seems to have BLAST, fastcgi, some weird xml crap.. bleh 11:02 < nmz787> oh 11:02 < nmz787> kanzure: yes 11:02 < nmz787> its just bindings 11:02 <@kanzure> i see 11:02 < nmz787> i use it through biopython 11:03 <@kanzure> so um, hey, what's the point of ncbi.nlm.nih.gov/books if the book contents can't be displayed? http://www.ncbi.nlm.nih.gov/books/NBK21154/ "By agreement with the publisher, this book is accessible by the search feature, but cannot be browsed" 11:04 < jrayhawk> presumably so that you can find them with searches? 11:04 < nmz787> you can see stuff if you search 11:04 < nmz787> so you could prob search streaking 11:09 -!- augur [~augur@129-2-129-33.wireless.umd.edu] has joined ##hplusroadmap 11:10 <@kanzure> jrayhawk: when i try to clone a private repo on gnusha i get "fatal: failed to open '/srv/git/enzyme-filter.git/objects/17': Permission denied" 11:10 <@kanzure> (i addaccess'd myself) 11:12 < jrayhawk> You may need to relogin in order to get the new groups 11:14 <@kanzure> doh 11:15 <@kanzure> nmz787: the only commit message in there is "Convert svn:ignore properties to .gitignore." 11:16 < nmz787> can you delete everything, i guess i'll jus push the files 11:16 < nmz787> this git to svn seems to not work 11:16 <@kanzure> you can do "git push -f" if you want to override the crap already there 11:16 <@kanzure> you don't want git->svn.. 
you want svn->git 11:16 < jrayhawk> receive.denynonfastforwards might be on 11:16 < nmz787> well there arent any files in the dir the stupid git svn command made 11:17 < nmz787> oh, right 11:17 < nmz787> well it sucks no matter 11:17 <@kanzure> jrayhawk: stat( /srv/git/enzmye-filter.git/objects ) failed: No such file or directory at /usr/share/perl5/Piny/Repo.pm line 601. 11:17 < nmz787> i just want something that easily does diff/patch 11:17 < nmz787> geez 11:18 < jrayhawk> 'enzmye' 11:25 -!- augur [~augur@129-2-129-33.wireless.umd.edu] has quit [Remote host closed the connection] 11:26 <@kanzure> oh fooey 11:26 <@kanzure> nmz787: ok i made a terrible typo 11:26 < nmz787> oh 11:27 <@kanzure> nmz787: ok now "git push -f" will work 11:27 <@kanzure> what did you do to convert it? 11:29 < chris_99> it's £185 for 100ug of GFP, isn't that rather expensive?! 11:31 < nmz787> git svn clone 'svn+ssh://secretrepo' --no-metadata -A authors-transform.txt --stdlayout ~/temp 11:31 < jrayhawk> looks like he's following http://john.albin.net/git/convert-subversion-to-git but something is going wrong with git svn 11:31 < nmz787> the shit here http://john.albin.net/git/convert-subversion-to-git 11:32 < nmz787> yes 11:32 < nmz787> so in Oregon you can ask a doctor to sterilize you without parental consent at 15 11:32 < nmz787> pretty cool 11:33 < nmz787> there are no files in the resulting temp dir 11:33 < nmz787> except for a .git dir 11:33 < jrayhawk> Which kind of sterilization? 11:33 < nmz787> sexual 11:33 < jrayhawk> i mean, more specifically 11:33 < nmz787>       (4) “Sterilization” means any medical procedure, treatment or operation for the purpose of rendering an individual permanently incapable of procreating. [1983 c.460 §3; 1991 c.67 §116] 11:34 < nmz787> from Definitions here http://www.leg.state.or.us/ors/436.html 11:34 < nmz787> wait 11:34 < nmz787> .us 11:34 < nmz787> same as diyhpl.us 11:34 < jrayhawk> holy moses 11:34 < nmz787> kanzure are you posing as Oregon? 11:34 <@kanzure> .us is a generic "united states" domain 11:34 <@kanzure> *gtld 11:34 < jrayhawk> i don't think even planned parenthood would give a 15 year old a hysterectemy 11:34 < nmz787> go Oregon 11:35 < jrayhawk> or a tubal ligation 11:35 < nmz787> though I suspect Oregon is not on the top of the over-populators list 11:35 <@kanzure> so do they have to be a resident? 11:35 < jrayhawk> well, if they cross state lines, then it becomes a federal issue 11:36 <@kanzure> jrayhawk: maybe you know an easier way for nmz787 to convert his svn repository 11:36 <@kanzure> nmz787: what's that molecular biology protocols book that everyone keeps recommending? 11:37 < jrayhawk> i suspect it's the wrong path or permissions are still wrong or something and git-svn doesn't give sufficient error messages 11:38 < nmz787> wish this was on android https://www.facebook.com/TheSimpsonsTappedOut 11:38 < nmz787> no this http://itunes.apple.com/us/app/id497595276?mt=8 11:38 < jrayhawk> i need to take off, so i probably won't be of much help for at least a couple hours 11:38 < nmz787> maniatas 11:39 < nmz787> molecular cloning a lab manual... 
by sambrook and russel 11:39 < nmz787> i guess maniatis was on older editions 11:40 < nmz787> this sounds like a PITA http://codeascraft.etsy.com/2011/12/02/moving-from-svn-to-git-in-1000-easy-steps/ 11:40 < nmz787> i want 1 or 2 steps 11:40 <@kanzure> oh it's sambrook 11:40 <@kanzure> "Molecular Cloning: A Laboratory Manual" 11:41 <@kanzure> right 11:41 <@kanzure> ok thanks 11:44 < nmz787> yeah i think the git svn is silently failing 11:44 <@kanzure> does "svn log" work? 11:46 < nmz787> ok looks like removing the -A authors-transform.txt --stdlayout is working 11:46 < nmz787> i actually see files being downloaded 11:47 <@kanzure> did you make authors-transform.txt? 11:47 <@kanzure> or did you neglect that part :P 11:48 < nmz787> yes 11:48 < nmz787> i made it 11:48 <@kanzure> hrrm 11:48 <@kanzure> well ok 11:49 < nmz787> it was a 1 line file 11:49 < nmz787> nathan = nathan 11:49 <@kanzure> haha i think you should've used "Nathan McCorkle " 11:53 < nmz787> well i'm not using it at all now 12:00 < nmz787> gah, it died 12:00 < nmz787> r16 = 94559e0d1272d17d0fa5f800855c89cc4071fe54 (refs/remotes/git-svn) 12:00 < nmz787> fatal: refs/remotes/trunk: not a valid SHA1 12:00 < nmz787> update-ref refs/heads/master refs/remotes/trunk: command returned error: 128 12:03 <@kanzure> that looks like a valid SHA1 to me.. 12:07 < nmz787> rf -Rfing and trying again 12:09 < gnusha> diyhpluswiki.git: 641e37c links to actual books 12:10 < nmz787> why do some of the files it shows during check say D 12:10 < gnusha> diyhpluswiki.git: 101c23a remove pesky inconsistency in book labelling 12:10 < nmz787> if i made them D (deleted) why are they showing up in a new checkout??? 12:10 <@kanzure> because you didn't svn commit your deletions 12:11 < nmz787> wtf 12:11 <@kanzure> (probably) 12:11 < nmz787> i said commit 12:11 <@kanzure> i don't know how svn works 12:11 <@kanzure> the only time i use it is when i'm converting it to git 12:11 <@kanzure> i have added some possibly-useful files here http://diyhpl.us/wiki/diybio/faq/books 12:12 <@kanzure> as far as i can tell there's no actually-useful free molecular biology book out there on the interwebs 12:13 < nmz787> the ones i like are free outside US copyright law 12:14 < nmz787> you should add mol bio of the gene 12:17 -!- eridu [~eridu@gateway/tor-sasl/eridu] has joined ##hplusroadmap 12:18 <@kanzure> ok grabbing 12:23 < gnusha> diyhpluswiki.git: 095952d add a link to Molecular Biology of the Gene 12:29 -!- SDr|London is now known as SDr 12:31 -!- eridu [~eridu@gateway/tor-sasl/eridu] has quit [Quit: Leaving] 12:31 <@kanzure> actually i think my copy of mbotg is bad.. doesn't seem to be a pdf, can't figure out what type of file it is 12:31 < nmz787> so i tried again, this time adding --username myusername 12:31 < nmz787> i dunno if the rm -Rf did it, or that 12:31 < nmz787> but it worked 12:32 < nmz787> so now 12:33 < nmz787> shit, just boiled a pot of water dry 12:36 < nmz787> ok kanzure the repo should be all pushed 12:38 < nmz787> hmm, beer or coffee 12:40 <@kanzure> sambrooks is pretty awesome, why haven't i read this yet? 12:45 < nmz787> dunno 12:46 < nmz787> late to the game 12:46 < nmz787> found that PDF after being in school maybe 6 months 12:46 < nmz787> referred to it quite often for background research 12:46 < nmz787> one lab i worked in actually had a paper copy, that was nice too 12:47 < nmz787> i doubt most other students used it much 12:48 < nmz787> kanzure: did you ever make an equipment auction app? 
12:48 -!- augur [~augur@129-2-129-33.wireless.umd.edu] has joined ##hplusroadmap 12:48 <@kanzure> nope not yet 12:48 < nmz787> i wonder if there is cheap stuff within an hour of NYC 12:49 -!- drazak__ [~ahdfadkfa@199.188.72.84] has joined ##hplusroadmap 12:49 -!- drazak__ [~ahdfadkfa@199.188.72.84] has quit [Client Quit] 12:50 <@kanzure> i want to find a book like sambrooks except for motion control automation things 12:51 <@kanzure> (and not something insulting like "here's some gear equations, now piss off") 12:51 < nmz787> hmm 12:51 < nmz787> i can email someone 12:58 -!- EnLilaSko- [~Nattzor@m77-219-191-209.cust.tele2.se] has joined ##hplusroadmap 12:59 -!- EnLilaSko [~Nattzor@unaffiliated/enlilasko] has quit [Ping timeout: 248 seconds] 13:15 <@kanzure> nmz787: do you have a pdf copy of cathal's wikibook? 13:15 <@kanzure> every time i try to render it on wikipedia it ends up with an error 13:15 <@kanzure> i've told cathal about this, but he can't fix it 13:16 <@kanzure> http://pastebin.com/raw.php?i=AdJg9gwv 13:16 < nmz787> no 13:17 < nmz787> can you complain? 13:17 < nmz787> submit bug? 13:18 <@kanzure> not sure where to submit this bug 13:19 <@kanzure> i guess i should complain to aran dunkley 13:20 <@kanzure> "Multibyte characters not working" 13:29 -!- drazak__ [~ahdfadkfa@199.188.72.84] has joined ##hplusroadmap 13:29 -!- drazak__ [~ahdfadkfa@199.188.72.84] has quit [Client Quit] 13:39 < nmz787> kanzure: when do you have time to look at that enzymeFilter code? 13:39 < nmz787> with me 13:41 <@kanzure> i can look at it in a few minutes (it's downloading but i'm on a slow connection) 13:42 < nmz787> cool 13:46 -!- jmil [~jmil@hive76/member/jmil] has quit [Quit: jmil] 13:46 -!- jk4930 [~jk@p57B73587.dip.t-dialin.net] has joined ##hplusroadmap 13:46 -!- strangewarp [~strangewa@c-76-25-200-47.hsd1.co.comcast.net] has quit [Read error: Connection reset by peer] 13:47 <@kanzure> nmz787: so what are all these multi-megabyte files in the main directory? 13:47 <@kanzure> like CAZyall.faa 13:48 <@kanzure> whog 13:48 -!- strangewarp [~strangewa@c-76-25-200-47.hsd1.co.comcast.net] has joined ##hplusroadmap 13:48 <@kanzure> KEGG_KO_list 13:49 <@kanzure> nmz787: i've pushed a minor change (git pull it) 13:49 < nmz787> they are... lists 13:49 < nmz787> what you do? 13:49 < nmz787> CAZy is carbohydrate active enzymes or something like that 13:49 <@kanzure> i removed a small svn file 13:49 < nmz787> KEGG is some japanese database/ontology AFAIK 13:50 < nmz787> KO is KEGG ORTHOLOG 13:50 <@kanzure> what is the difference between enzymeFilter/views.py.new and enzymeFilter/views.py.old ? 
13:50 <@kanzure> and enzymeFilter/views.py 13:50 < nmz787> umm 13:51 < nmz787> i dunno if the dates are there or not 13:51 < nmz787> but the .new .old are both not active 13:51 < nmz787> so I'm not sure but they can be deleted most likely 13:51 < nmz787> i stopped using svn after I forgot it last summer 13:51 <@kanzure> ok pushed 13:52 -!- srangewarp [~strangewa@c-76-25-200-47.hsd1.co.comcast.net] has joined ##hplusroadmap 13:52 < nmz787> brb, lemme switch to linux pidgin in VM 13:52 -!- nmz787 [~Nathan@ool-45792f2b.dyn.optonline.net] has left ##hplusroadmap [] 13:53 -!- strangewarp [~strangewa@c-76-25-200-47.hsd1.co.comcast.net] has quit [Disconnected by services] 13:53 -!- srangewarp is now known as strangewarp 13:54 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has quit [Quit: Leaving] 13:57 -!- nmz787 [~nathan@ool-45792f2b.dyn.optonline.net] has joined ##hplusroadmap 13:57 <@kanzure> nmz787: i think your first step is to switch out postgresql in settings.py with sqlite 13:57 <@kanzure> second step is to assemble a requirements.txt file to list out the dependencies 13:58 <@kanzure> this way you can re-install this app in different environments 13:58 < nmz787> why go to sqlite? 13:58 <@kanzure> well, unless you want to install postgresql on your system 13:58 < nmz787> well isn't it faster and more robust? 13:58 <@kanzure> requirements.txt is necessary so that you know which version of django you're using, or which version of the other packages 13:59 < nmz787> oh, its the newest version of django, 3 something i think 13:59 <@kanzure> postgresql is definitely better for production, but sqlite is really nice for development (unless you're dealing with huge databases..) 13:59 < nmz787> grr, i had that all written somewhere, once 13:59 <@kanzure> i thought you said it was from last year? 13:59 < nmz787> well i upgraded to the newest django about 5 months ago 14:00 <@kanzure> do you have virtualenv stuff already installed? 14:00 < nmz787> never use virtualenc 14:00 < nmz787> env 14:00 < nmz787> never saw the reason 14:00 <@kanzure> i suggest doing this: 14:00 <@kanzure> curl -s https://raw.github.com/brainsik/virtualenv-burrito/master/virtualenv-burrito.sh | bash 14:00 <@kanzure> this way you can switch into a virtualenv just for this project 14:00 < nmz787> why? 14:00 <@kanzure> and keep track of the dependencies 14:00 < nmz787> the VM i'm on is only for doing this 14:00 <@kanzure> because then you can commit requirements.txt and just push your git repository around 14:00 <@kanzure> when you want to deploy. 14:01 < nmz787> likewise, the server its on is only for this app 14:01 <@kanzure> yes, but you're not going to push this back to jbei 14:01 <@kanzure> presumably you will want to deploy this elsewhere 14:01 < nmz787> if you help me get it working i will 14:01 < nmz787> hmm 14:01 < nmz787> well, sure 14:01 <@kanzure> exactly. so using requirements.txt is how you're going to get it to work. 14:01 < nmz787> but why use virtualenv? 14:01 < nmz787> why should the app be as easy to install along whatever system? 14:02 <@kanzure> because then you don't need me in the future 14:02 <@kanzure> brownies: help me :( 14:02 < brownies> kanzure: ? 
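The settings.py swap kanzure suggests above is a one-dictionary change. A minimal sketch for a Django 1.x-era settings file; the sqlite filename is a placeholder, and the commented-out line shows the postgresql backend it would replace for local development:

    # settings.py
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.sqlite3",
            # "ENGINE": "django.db.backends.postgresql_psycopg2",  # keep this for production
            "NAME": "enzymefilter.db",   # sqlite just needs a file path (placeholder name)
        }
    }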
14:03 < nmz787> i don't follow, i need your help with a file parser mainly, and figuring out if there's a better way to background processes than django-celery 14:03 <@kanzure> brownies: i'm trying to describe why something like rvm and bundler is useful, except in the context of django 14:03 <@kanzure> django-celery is perfectly fine for spawning workers 14:03 < brownies> if he uses bundler, then... it's the same thing. same reasons. 14:03 < nmz787> :/ 14:03 < nmz787> huh? 14:03 <@kanzure> brownies: he doesn't use bundler, this is a python project without any dependency management 14:03 <@kanzure> it's basically a blob of broken code on my system 14:04 < nmz787> right, it requires django, django-celery, and biopython 14:04 < nmz787> i think thats it 14:04 <@kanzure> brownies: maybe you know a way to convince him to actually use these tools 14:04 <@kanzure> brownies: because i seem to be failing 14:04 <@kanzure> yes but which version of those 14:04 < nmz787> should it always be 'the newest' 14:04 < nmz787> shouldn't it* 14:04 <@kanzure> i'd rather use the exact versions that you're using 14:05 <@kanzure> you don't want my system-wide version of biopython 'cause that could be anything 14:06 < nmz787> do you use debian? 14:07 < brownies> nmz787: you have to understand that the "app" you've made has to run on many different machines 14:07 < nmz787> sure 14:07 < brownies> even if it's *only* you *ever* working on it, you still need to account for dev vs. staging vs. production 14:07 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has joined ##hplusroadmap 14:07 < nmz787> hrmm 14:07 < brownies> so, how do we maintain a uniform environment across machines? we do that by specifying "this app *requires* these packages, at this version, running on this python" 14:08 < brownies> nmz787: using different versions of packages in different places may break the app, so this is an attempt to ensure some degree of portability 14:08 < nmz787> sure 14:08 < nmz787> but it seems ugly 14:08 <@kanzure> and i don't want to debug this if 50% of my packages are wrong 14:08 < brownies> you are basically saying "this is what it was built and tested with; if you recreate that environment on your system, it should also work" 14:08 < nmz787> i'd rather the app be one file 14:08 < brownies> ... 14:08 < nmz787> sure 14:08 <@kanzure> what does that have to do with it? 14:08 <@kanzure> it's already like 50 files 14:08 < nmz787> right 14:09 < nmz787> i would love if it was 1 file 14:09 < nmz787> just like the old days 14:09 < brownies> i would love if i had a hot tub in my backyard, but... looking out there, i still don't see one. 14:09 < nmz787> well so what do i do ? 14:09 < nmz787> find the versions, then what 14:09 <@kanzure> curl -s https://raw.github.com/brainsik/virtualenv-burrito/master/virtualenv-burrito.sh | bash 14:09 < brownies> then you specify them, written down in a format that other people can understand 14:09 <@kanzure> mkvirtualenv enzyme-filter 14:09 <@kanzure> workon enzyme-filter 14:09 < brownies> other people have built tools to automatically understand such files in specific formats 14:09 <@kanzure> pip install thethingyouwant 14:09 < brownies> for ruby, people use bundler. for python, people use pip. 14:09 < brownies> pip + virtualenv. 14:10 <@kanzure> pip freeze > requirements.txt 14:10 < nmz787> kanzure: do i run those in the git repo dir? 
14:10 <@kanzure> git add requirements.txt; git commit -m'yo dog, requirements' 14:10 <@kanzure> i suggest running the "pip freeze > requirements.txt" and git commands in the repo yes 14:11 -!- sylph_mako [~mako@168.3.252.27.dyn.cust.vf.net.nz] has joined ##hplusroadmap 14:11 <@kanzure> you can also manually write requirements.txt in the format "django==5.5.5" on each line 14:11 <@kanzure> you can check which version you have with uh.. um. 14:12 <@kanzure> ah, with pip freeze | grep django 14:12 <@kanzure> well, actually, it's Django 14:13 <@kanzure> for instance, my system version of django is apparently 1.2.3 14:13 <@kanzure> so i'd write "Django==1.2.3" 14:13 < nmz787> well, I should upgrade Django then to the latest 14:13 < nmz787> before doing this 14:13 < nmz787> right? 14:14 <@kanzure> no 14:14 <@kanzure> because it might not work on latest django 14:14 < brownies> no... and hopefully you see *why* you shouldn't 14:14 < nmz787> but latest django is prob more secure 14:14 <@kanzure> latest django probably doesn't work with your code 14:14 < nmz787> so upgrade, take care of (possibly) broken stuff 14:14 <@kanzure> and even if it does, you have zero unit tests, so you can't be sure that it works 14:15 < nmz787> add to virtualenv burrito thing 14:15 < nmz787> i didnt unit test this anyway 14:15 <@kanzure> no don't upgrade django 14:15 <@kanzure> whateveryoudo don't do that. 14:15 < nmz787> ok 14:16 * brownies facepalms 14:16 < brownies> newer version != better 14:18 < nmz787> You asked me to pull without telling me which branch you 14:18 < nmz787> want to merge with, and 'branch.master.merge' in 14:18 < nmz787> your configuration file does not tell me, either. Please 14:18 < nmz787> specify which branch you want to use on the command line and 14:18 < nmz787> try again (e.g. 'git pull '). 14:18 < nmz787> See git-pull(1) for details. 14:18 < nmz787> so git pull didnt work 14:18 < brownies> eh? what did you run? 14:18 < nmz787> git pull 14:18 < brownies> why did you git pull? 14:18 <@kanzure> i committed things and pushed to the server 14:19 <@kanzure> and i think nmz787 also committed things 14:19 < nmz787> to prepare to commit and push req.ytxt 14:19 <@kanzure> anyway, i think a merge is probably okay 14:19 < nmz787> wait, no 14:19 <@kanzure> git pull origin master 14:19 < nmz787> why should it merge? 14:19 <@kanzure> because there are two different HEADs right now 14:19 < nmz787> why? 14:19 <@kanzure> there's mine (on the server) and yours 14:19 < nmz787> how did that happen? 14:19 <@kanzure> i committed in git 14:19 < brownies> just... git pull origin master 14:19 <@kanzure> and then pushed to the server 14:19 < brownies> you guys should learn about branches -_- 14:20 <@kanzure> well, i thought my changes should be on master 14:20 < nmz787> wtf is that? 14:20 < nmz787> why isn't this easier? 14:20 <@kanzure> this is very easy 14:20 <@kanzure> "git pull origin master" 14:20 < nmz787> no its not 14:20 < nmz787> why did we ever get off the same branch or master or head 14:20 < nmz787> etc 14:21 <@kanzure> you are still on master 14:21 < nmz787> it didnt seem to know that 14:21 <@kanzure> type git status and it will tell you that you're on master 14:21 <@kanzure> "git pull" was warning you that there were upstream changes 14:21 <@kanzure> sometimes you don't want an automerge to happen 14:21 <@kanzure> this is not one of those times 14:22 < nmz787> upstream changes?? 14:22 < nmz787> you deleted files 14:22 <@kanzure> yeah. and then i pushed my commits to the server ("upstream"). 
14:22 < nmz787> so why didn't pull work? because I already said commit on req.txt? 14:22 <@kanzure> so, let's imagine the server didn't exist for a moment 14:22 <@kanzure> no 14:22 <@kanzure> pull didn't work because it was warning you that there were changes on the remote git repository 14:22 < nmz787> that's WHY i was pulling 14:23 < nmz787> because I knew there were changed 14:23 < nmz787> changes 14:23 <@kanzure> it was also telling you that to complete that operation you should type "git pull origin master" 14:23 < brownies> git pull didn't work because it wasn't specific enough 14:23 < brownies> never type "git pull" and nothing else 14:23 <@kanzure> well.. i do "git pull" all the time. it's really an alias for two other operations. 14:24 <@kanzure> "git pull" runs "git fetch" and then "git merge" 14:24 <@kanzure> "git fetch" updates your local repo's knowledge of some remote branch, plus pulls in those commits, without touching your branches 14:24 < brownies> once you have multiple remotes and multple branches, it becomes a pretty stupid idea 14:24 < nmz787> emote: skipping bad filename enzymeFilter/media/jszip/jszip/docs/ZIP spec.txt 14:24 <@kanzure> nmz787: that's ikiwiki printing out a statement, you can ignore that 14:24 -!- EnLilaSko- [~Nattzor@m77-219-191-209.cust.tele2.se] has quit [Quit: - nbs-irc 2.39 - www.nbs-irc.net -] 14:24 <@kanzure> ikiwiki generates a private wiki based on the files, but in this case we don't care 14:25 <@kanzure> and ikiwiki- for some unholy reason- hates filenames with spaces 14:25 < nmz787> ok 14:25 < nmz787> so I added the pip freeze>requirements.txt 14:26 <@kanzure> looking.. 14:26 <@kanzure> ok looks good 14:27 <@kanzure> so now i'm running: mkvirtualenv enzyme-filter; workon enzyme-filter; pip install -r requirements.txt 14:27 <@kanzure> Downloading/unpacking BeautifulSoup==3.1.0.1 (from -r requirements.txt (line 1)) 14:27 <@kanzure> Could not find a version that satisfies the requirement BeautifulSoup==3.1.0.1 (from -r requirements.txt (line 1)) (from versions: ) 14:27 <@kanzure> um, which version of python is this? 14:28 < nmz787> sounded like that should be in req.txt! 14:28 < nmz787> i dunno 14:28 < nmz787> 2.X 14:28 <@kanzure> hmm 14:28 <@kanzure> python --version 14:28 < nmz787> 2.6.6 14:28 < nmz787> i dont think that should matter 14:28 <@kanzure> really? i'm using 2.6.6 too 14:28 <@kanzure> so um, how come it can't get that version of BeautifulSoup? 14:29 <@kanzure> are you even using BeautifulSoup in this project 14:29 < nmz787> just change beautiful soup to ver 3.2.0 14:29 < nmz787> yeah, i was 14:30 <@kanzure> but why does requirements.txt say 3.1.0.1 14:30 <@kanzure> maybe you installed it manually from them? 
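For what it's worth, the hand-written requirements.txt kanzure described is usually much shorter than a pip freeze of an entire system: only the packages the app actually imports, pinned to known-good versions. A sketch of what that might look like for this project; the pins marked as placeholders are illustrative, not versions confirmed anywhere in the log:

    # requirements.txt -- trimmed by hand
    Django==1.4            # placeholder pin; use whatever `pip freeze | grep -i django` reports
    django-celery==3.0.1   # placeholder pin
    biopython==1.54        # the version nmz787 reports on the JBEI box a bit later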
14:30 <@kanzure> ok anyway, i'm trying 3.2 14:30 < nmz787> because i just told you to change it 14:31 < nmz787> yeah i dont think it will matter 14:31 < nmz787> 3.2 is what my ubuntu has 14:31 < nmz787> so you should be able to dl that 14:31 <@kanzure> btw: i'm not on ubuntu (i'm on debian), which is more evidence of why pip is a good idea for isolating the web app from which system it's running on 14:32 < nmz787> yeah i know you're on deb 14:32 < nmz787> which is weird, because the jbei machine is deb, which is where i did pip freeze 14:33 < nmz787> well i don't think its using beautifulsoup 14:33 < nmz787> i grepped the entire dir, and its only in req.txt 14:33 <@kanzure> Could not find any downloads that satisfy the requirement iotop==0.4 (from -r requirements.txt (line 11)) 14:33 < nmz787> dunno what that is 14:33 < nmz787> i only did what you told me! 14:34 < nmz787> any deps will be at the top of views.py 14:35 <@kanzure> still installing things. 14:35 < nmz787> i doubt you really need anything in req.txt 14:36 <@kanzure> biopython isn't listed 14:36 < nmz787> yeah 14:36 <@kanzure> which means that biopython isn't installed on that system 14:36 < nmz787> it is 1.54 14:36 < nmz787> yeah it is 14:36 < nmz787> i just imported it into python 14:36 < nmz787> on that sys 14:36 -!- minimoose [~minimoose@pool-173-75-216-239.phlapa.fios.verizon.net] has joined ##hplusroadmap 14:37 <@kanzure> sure.. but pip looks at your installed libraries, and "pip freeze" dumps the full list. so somehow biopython is not installed. anyway, i'm adding it to requirements.txt 14:37 <@kanzure> and 1.54 seems to exist, so it's installing now. 14:38 < brownies> btw... sad news from the monkey front http://www.nytimes.com/2012/08/30/science/low-calorie-diet-doesnt-prolong-life-study-of-monkeys-finds.html?_r=1 14:38 <@kanzure> Could not find any downloads that satisfy the requirement reportbug==4.12.6 (from -r requirements.txt (line 18)) 14:39 -!- Falfe [~not@c83-251-81-162.bredband.comhem.se] has quit [Ping timeout: 260 seconds] 14:39 <@kanzure> biopython fails to install with: error: option --single-version-externally-managed not recognized 14:40 < nmz787> python-biopython shows up as installed in dpkg 14:40 < nmz787> pip must be broken 14:40 < nmz787> just do apt-get install python-biopython 14:40 < nmz787> i don't think req.txt is helping at all 14:40 < nmz787> no one uses reportbug 14:41 <@kanzure> you had to do "pip freeze" because you didn't have a pip file from before; most people just write their requirements file as they go along 14:41 <@kanzure> so naturally "pip freeze" included a bunch of things that were installed system-wide 14:41 < nmz787> right 14:41 < nmz787> which is what /you/ said to do 14:41 <@kanzure> yep 14:42 < nmz787> i told you before that, django, biopython, and django-celery 14:42 < nmz787> if you have a vanilla debian install, if my shit is broken it deserves to be 14:42 < nmz787> it deserves to be fixed to be compatible with latest 14:42 < nmz787> IMO 14:42 * nmz787 shrugs 14:43 <@kanzure> before fixing things you have to make sure it works with the latest versions 14:43 <@kanzure> i mean, with the versions that you previously had working 14:43 <@kanzure> my debian is far from vanilla- it's customized like crazy, so trusting it is a terrible idea, and the project isn't far enough along to justify a custom vm 14:43 <@kanzure> jrayhawk would suggest a chroot but, again, that would be a few gigs for this :P 14:44 < brownies> pip is not exactly a stellar piece of software 14:44 <@kanzure> 
no 14:45 * kanzure is still installing things 14:45 < nmz787> geex 14:45 < nmz787> what are you installing? 14:46 <@kanzure> python-dev, for numpy 14:46 <@kanzure> because numpy has to compile things when it installs via an egg 14:46 < nmz787> why don't you just apt-get all that? 14:46 < chris_99> does anyone know about biotic interactions with yeast cells? 14:46 <@kanzure> nmz787: i already have numpy installed, but not in my virtualenv 14:46 < nmz787> chris_99: i am about to have a biotic reaction with some yeast 14:46 < nmz787> when i drink this beetr 14:46 < nmz787> beer 14:47 < chris_99> hehe 14:47 < nmz787> kanzure: and why doesnt that pass through? 14:47 <@kanzure> because that's not how virtualenvs work 14:47 <@kanzure> virtualenvs separate your code from my system installation 14:51 < nmz787> right, but.... umm 14:51 < nmz787> don't you already have numpy? 14:51 <@kanzure> i do have numpy. i don't have numpy in enzyme-filter's virtualenv. 14:52 <@kanzure> i think i have at least six versions of numpy on my system haha 14:52 <@kanzure> okay, so i've pushed a branch (since you seem to hate when i put things into master..) 14:52 <@kanzure> git fetch origin requirements 14:52 <@kanzure> git checkout origin/requirements 14:52 <@kanzure> then "git log" to see what i did 14:53 <@kanzure> to get back to master you would type "git checkout master" or "git branch -a" to see a list. 14:53 <@kanzure> oh oops i meant "git checkout remotes/origin/requirements" 14:53 < nmz787> what? 14:53 < nmz787> no, i hate branches! 14:53 < nmz787> :D 14:54 < nmz787> ugh 14:55 <@kanzure> so anyway, i think this should be in master, so you can pull master now with my changes 14:55 < brownies> this is kinda funny 14:55 <@kanzure> brownies: ? 14:55 < brownies> largely because i don't have to suffer through it, and i can just watch the action =P 14:56 <@kanzure> well, now the app can presumably work wherever, including on my system 14:56 <@kanzure> nmz787: so you wanted help with something in particular on this right? or just getting it installable? 14:56 < nmz787> hmm, ok 14:56 < nmz787> well its installed and running now, so the former 14:57 < nmz787> enzymeFilter/views.py is where 95% of stuff happens 14:57 <@kanzure> (enzyme-filter)kanzure@pikachu:~/local/enzyme-filter$ python manage.py runserver 0.0.0.0:4030 14:57 < nmz787> the rest is HTML templates and javascript and CSS crap 14:57 <@kanzure> Error: No module named djangoApps.enzymeFilter 14:57 <@kanzure> the heck is djangoApps ?? 14:57 < nmz787> oh, the folder that is now enzyme-filter 14:58 <@kanzure> waait why is stuff like blastGene() in views.py 14:58 < nmz787> i like single file programs 14:59 <@kanzure> ok this is why it's slow 14:59 < nmz787> y? 14:59 -!- srangewarp [~strangewa@c-76-25-200-47.hsd1.co.comcast.net] has joined ##hplusroadmap 14:59 -!- strangewarp [~strangewa@c-76-25-200-47.hsd1.co.comcast.net] has quit [Disconnected by services] 14:59 <@kanzure> BLAST should not be in your views file 14:59 -!- srangewarp is now known as strangewarp 14:59 < nmz787> huh? 14:59 <@kanzure> it should be a separate task that the web app spawns 14:59 < nmz787> why not 14:59 <@kanzure> or it should be in your controller 14:59 <@kanzure> because views are supposed to be about rendering silly little templates 14:59 < nmz787> MVC blah blah, tl;dr that class 14:59 <@kanzure> oh god. 
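A minimal sketch of the split kanzure is pushing for here: the blocking BLAST call moves into a django-celery task and the view only enqueues it. Everything below is illustrative rather than the project's actual code: run_blast is a made-up task name, and the in-function import assumes blastGene() stays where it currently lives in views.py (decorator API as shipped with the celery/django-celery of this era):

    # enzymeFilter/tasks.py -- illustrative sketch
    from celery.task import task

    @task
    def run_blast(sequence):
        """Do the slow BLAST work in a worker process, not in the HTTP request."""
        from enzymeFilter.views import blastGene   # or move blastGene() out of views.py entirely
        return blastGene(sequence)

The view would then call run_blast.delay(sequence) and return immediately, with the workers (python manage.py celeryd, plus a broker such as rabbitmq) doing the slow part.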
14:59 < nmz787> :D 14:59 <@kanzure> i give up 15:00 < nmz787> well that just calls BLAST anyway 15:00 < nmz787> and only if you ask it to 15:00 < nmz787> its not what is slow 15:00 <@kanzure> yes it is.. BLAST is blocking 15:00 < nmz787> no 15:00 <@kanzure> so that means your http request waits until it is done 15:00 < nmz787> BLAST doesnt get called 15:01 < nmz787> i'm pretty sure that is forked by biopython to the C version of NCBI tools 15:01 < nmz787> but even when I used that, it didn't make things slow 15:02 < nmz787> i could never figure out how to get celery and the rabbitmq server setup as daemons either 15:02 < nmz787> which kinda makes me not like celery 15:02 < nmz787> but i could just be dumb 15:03 <@kanzure> i think on ubuntu you just type "start celeryd" and make sure it's in your /etc/init/ folder 15:03 <@kanzure> something like that 15:04 < nmz787> so the stuff i need help with is the process*** at the bottom 15:04 < nmz787> they aren't using the ORM efficiently i think 15:04 <@kanzure> processFAA? 15:04 < nmz787> or rather, I shouldn't be using the ORM, because what I think I really want is raw SQL bulk inserts 15:05 < nmz787> FAA, FNA, and XLS 15:05 <@kanzure> first thing i suggest is moving the import statements out of the function.. 15:05 <@kanzure> so you're importing data from these files? 15:05 < nmz787> well that's not the problem 15:05 < nmz787> yes 15:07 < nmz787> upload receives the files, then when all 3 are present they get processed 15:07 < brownies> o.O 15:07 <@kanzure> why are you uploading files? why not just run a script on files on the server 15:08 < nmz787> first there are some .py files in the root dir of the project that need run, to set up all the different ontology links (i.e. how to map one to the other) 15:08 < nmz787> because JGI didnt hire me 15:08 < nmz787> JBEI did 15:08 < nmz787> and JGI has the server 15:10 <@kanzure> oh they didn't let you on the server? 15:11 < nmz787> well, no, they don't want me messing with pip and virtualenv on their servers 15:13 <@kanzure> they sound lame 15:14 < nmz787> psh 15:14 < nmz787> they sequence DNA, not do synBio 15:14 -!- _sol_ [~Sol@c-174-57-58-11.hsd1.pa.comcast.net] has joined ##hplusroadmap 15:15 -!- sylph_mako [~mako@168.3.252.27.dyn.cust.vf.net.nz] has quit [Ping timeout: 252 seconds] 15:16 <@kanzure> nmz787: so, you think that the sql inserts are slow, or parsing the file? 15:16 <@kanzure> if the file is a few hundred megs, i think you should do a stream parser or something 15:16 <@kanzure> where you load a line at a time (or w/e small fragment makes sense) 15:19 <@kanzure> brownies: btw.. what do you think? http://diyhpl.us/wiki/diybio/faq/books/ try the sambrook one 15:22 < nmz787> the ORM stuff 15:22 <@kanzure> have you profiled it? 15:22 < nmz787> i can read the file and for loop through each line very fast 15:23 <@kanzure> you can add statements to get datetime and then check datetime again and print the difference between statements 15:23 < nmz787> not since last year, i don't remember 15:23 < nmz787> but i am sure its the ORM 15:23 <@kanzure> ok. i see. yeah, so some of the ORM statements might be querying all objects for some reason, or it might be the transaction committing that is sucking up time 15:26 < nmz787> i wrote it like i thought django wanted me to... 
as soon as I added something to the ORM, I retrieved it from there (even though it may have still been in a local var) 15:27 <@kanzure> oh, no, you don't need to do that 15:27 -!- sylph_mako [~mako@14.24.252.27.dyn.cust.vf.net.nz] has joined ##hplusroadmap 15:27 < nmz787> basically I have that geneObject 15:28 < nmz787> which is getting built from the file data 15:29 < nmz787> if that geneObject was just some list or something that got built... then the whole thing inserted in one SQL call... i think that would fix things 15:29 < nmz787> rather than geneObject being an ORM object... with everytime I touch it, it calling the db 15:34 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has quit [Quit: Leaving] 15:35 < brownies> kanzure: much improved. i'll pick one up, thanks. 15:39 < nmz787> kanzure: do you have any recommendations on where to start, or things i should do to break this down 15:39 < nmz787> so i can start fixing it, properly 15:40 < nmz787> kanzure: FYI, my mentor, Patrik, originally just wanted scripts to run on his own box 15:40 < nmz787> but also foresaw that others would find utility in it 15:40 < nmz787> so I said 'O I know some Django!!!' 15:41 < nmz787> and got swept into CSS and javascript uploaders 15:42 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has joined ##hplusroadmap 15:43 -!- skorket [~skorket@cpe-24-58-232-122.twcny.res.rr.com] has joined ##hplusroadmap 15:45 -!- sylph_mako [~mako@14.24.252.27.dyn.cust.vf.net.nz] has quit [Ping timeout: 245 seconds] 15:49 -!- jmil [~jmil@hive76/member/jmil] has joined ##hplusroadmap 15:51 < skorket> evening all 15:52 < nmz787> hi 15:55 <@kanzure> nmz787: i suggest starting with unit tests, i guess. 15:56 < nmz787> what does that mean in this case? 15:56 < nmz787> generally i've just been uploading files and watching where it breaks 15:56 <@kanzure> well, this way you can make sure that code still works 15:56 < nmz787> or adding in printlines to things i know are wonky 15:57 <@kanzure> let's say that you have a function that does a multi-step calculation based on input 15:57 <@kanzure> you would write a unit test to see if the function still works with previous inputs and if those inputs still cause it to return the right answer 15:57 <@kanzure> https://docs.djangoproject.com/en/dev/topics/testing/ 15:57 <@kanzure> in your case there's a file enzymeFilter/tests.py 15:57 <@kanzure> which is all lonely :( 15:58 <@kanzure> i think you run it with "python manage.py test" 15:59 < nmz787> that file is practically empty 15:59 <@kanzure> yep.. you gotta write tests to confirm that your stuff works / still works with new updates 15:59 <@kanzure> *new changes 16:00 < chris_99> couple of interesting courses starting soon - https://www.coursera.org/category/biology 16:00 <@kanzure> i think this is a good place to start because you can have a re-usable way to determine which parts are breaking 16:00 < nmz787> hmm 16:02 -!- sylph_mako [~mako@14.24.252.27.dyn.cust.vf.net.nz] has joined ##hplusroadmap 16:08 < brownies> don't get too bogged down into TDD/BDD dogma 16:08 < brownies> just think of it as automating the tedious tasks you're doing right now to test your app 16:09 < nmz787> TDD/BDD? 16:10 < nmz787> well right now i open the webpage, upload a file, and wait for it to break, then fix/screw with... 
and repeat 16:10 < nmz787> i'll have to read that testing doc 16:11 < nmz787> new zealand is sposed to be pretty sweet 16:11 < nmz787> whoops 16:12 <@kanzure> yeah you shouldn't have to manually do any of that 16:12 <@kanzure> i'd be far too lazy to do that more than uh, once 16:13 < nmz787> oh 16:13 < nmz787> i'm the otherway, too lazy to learn the (right?) way 16:13 < nmz787> its not really too much though, click a few things, wait for breakage 16:14 <@kanzure> you're using "python manage.py runserver" right? 16:14 < nmz787> figuring out how to call a function in django from outside sounds PITA 16:14 < nmz787> yeah 16:15 <@kanzure> you just import your project and then call it 16:16 <@kanzure> https://docs.djangoproject.com/en/dev/topics/testing/#django.test.client.Client.post 16:16 <@kanzure> from myapp.models import Animal, etc. 16:25 -!- jmil [~jmil@hive76/member/jmil] has quit [Read error: Connection reset by peer] 16:25 -!- jmil [~jmil@hive76/member/jmil] has joined ##hplusroadmap 16:27 -!- augur [~augur@129-2-129-33.wireless.umd.edu] has quit [Remote host closed the connection] 16:27 -!- skorket [~skorket@cpe-24-58-232-122.twcny.res.rr.com] has quit [Ping timeout: 248 seconds] 16:36 -!- skorket [~skorket@cpe-24-58-232-122.twcny.res.rr.com] has joined ##hplusroadmap 16:48 <@kanzure> so what is the "methods in molecular biology" series from springer? 16:48 <@kanzure> is this a journal, or is it more like updated textbook editions? 16:49 <@kanzure> "Methods in Molecular Biology is a book series from publisher Humana Press that presents biomedical and life science research methods and protocols. The book series was introduced by series editor Dr. John Walker [disambiguation needed] in 1983 and provides step-by-step instructions for carrying out experiments in a research lab.[1]" 16:49 <@kanzure> "As of April 2011, 867 volumes had been published in the series. The protocols are also available online at Springer Protocols." 16:50 < nmz787> its mainly protocols i think 16:50 < nmz787> like once a technique gets pretty good 16:50 <@kanzure> there's 126 books in the 2012 series 16:50 <@kanzure> http://www.springerlink.com/content/1064-3745/copyright/2012/ 16:51 <@kanzure> 933 volumes 16:51 < nmz787> they compile a ton of original/influential/top-game papers, along with a good review tying it all together in the beginning 16:51 -!- hifrog [~swamp@p5B16D6F4.dip.t-dialin.net] has quit [Quit: Love Everything! Resistance is Futile.] 16:51 <@kanzure> so, "volume 2" of a book from this series will probably contain different protocols? 16:51 < nmz787> did you see that ACS Synthetic Biology comment i made early today? 16:51 < nmz787> they dont make it in paper format 16:51 <@kanzure> huh. 16:52 < nmz787> so i can't really even be too much of a fanboy 16:52 < nmz787> i woulda got the backissues 16:52 < nmz787> :) 16:53 -!- danielcc [62471046@gateway/web/freenode/ip.98.71.16.70] has joined ##hplusroadmap 16:53 <@kanzure> danielcc: hi 16:53 -!- nmz787 [~nathan@ool-45792f2b.dyn.optonline.net] has quit [Quit: Leaving.] 16:53 <@kanzure> wait don't leave 16:53 -!- nmz787 [~Nathan@ool-45792f2b.dyn.optonline.net] has joined ##hplusroadmap 16:53 <@kanzure> ok there you are 16:54 <@kanzure> danielcc: so you were saying you think you can get a 20 um spot size? 
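Circling back to the tests.py suggestion from a little earlier: the manual loop nmz787 describes (open the page, upload a file, wait for it to break) is exactly what Django's test client can automate. A minimal sketch; the URL, the form field name, and the fixture path are guesses, not the app's real ones:

    # enzymeFilter/tests.py -- hedged sketch of a first smoke test
    from django.test import TestCase

    class UploadSmokeTest(TestCase):
        def test_faa_upload_does_not_blow_up(self):
            with open("testdata/small.faa", "rb") as fh:        # small fixture file to be added
                response = self.client.post("/upload/", {"faa_file": fh})
            # exceptions in the view surface as test errors; here we just rule out server errors
            self.assertTrue(response.status_code < 500)

Run it with python manage.py test enzymeFilter, as kanzure says, and every later change gets the "does uploading still work" check for free.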
16:55 < danielcc> Brian: someone has offered me a 100W lazer for the project, though I tried just scratching borosilicate glass and think I may have gotten into the ballpark of the 20um goal 16:55 < danielcc> I will probably go this weekend and take a peak with an afm 16:56 < nmz787> oh its a 10.6 micron IR? 16:56 < nmz787> CO2? 16:57 < danielcc> I just got off the phone with a sr bioengineer and dear friend and he brought up a point that the glass may be to jagged if mechanically scratched(I guess we will see after the afm pictures are back) but I was curious if anyone knew of a chemical process to smooth things out 16:57 < danielcc> I was thinking something like a sulfuric acid and h2o2 bath 16:57 < danielcc> but I am but a humble EE 16:57 < nmz787> HF 16:58 < nmz787> but its nasty 16:58 < danielcc> I dont want to deal in hf if i dont have too... 16:58 < danielcc> I've seen my fair share of it in the SiC business for a lifetime 16:59 <@kanzure> glass is just an ideal material, but we will probably be prototyping with pdms too 16:59 < danielcc> my main question is does anyone here know of any modifications to the experiment before I go hit up the afm so I am not wasting microscope time? 17:00 <@kanzure> so you just drew a line with a 100W laser? 17:00 <@kanzure> nmz787 has a test image that he used before that tried out a number of patterns 17:00 <@kanzure> some svg file somewhere? 17:01 <@kanzure> nmz787: where is microfluidic-chip-test-1.svg ? 17:02 < danielcc> no I scored the glass with a very finely etched tungsten carbide tip 17:02 <@kanzure> wait, no that wasn't the file 17:04 <@kanzure> http://diyhpl.us/~bryan/papers2/diybio/mccorkle_tomkins-tinch_microchannels.svg 17:04 <@kanzure> but this was for playing with a laser 17:07 < danielcc> I am assuming also the accuracy is decent enough with this bit... 17:07 -!- eudoxia [~eudoxia@r186-54-251-164.dialup.adsl.anteldata.net.uy] has joined ##hplusroadmap 17:07 < danielcc> sorry for the delay but someone just gave me a ring 17:07 < nmz787> well are you using CO2 laser? 17:08 <@kanzure> no he's using a tungsten carbide tip 17:08 < eudoxia> hey, strangewarp, you unfollowed Dale just before shit matched coordinates with the fan, now he's arguing with Max More 17:08 < nmz787> ok 17:08 < nmz787> oh 17:08 < nmz787> is the afm free? 17:09 -!- augur [~augur@208.58.5.87] has joined ##hplusroadmap 17:10 <@kanzure> nmz787: so he's wondering if he should do anything else to the sample before sending it off to afm 17:12 < danielcc> The AFM is in the lab I used to work at 17:13 < danielcc> the guys offered to let me come in and run it for old time sake assuming this project was not for profit 17:13 < danielcc> I however do not want to over extend my welcome, if you know what I mean 17:16 < nmz787> I think glass may crack if not cooled... 17:16 < nmz787> there is that kickstarter 17:16 < nmz787> http://www.kickstarter.com/projects/fsl/affordable-20x12-laser-cutter-engraver-assembled-i 17:16 -!- eudoxia [~eudoxia@r186-54-251-164.dialup.adsl.anteldata.net.uy] has quit [Ping timeout: 252 seconds] 17:17 < nmz787> place an array of TEC peltiers under there 17:17 < nmz787> the floor drops out 17:17 < nmz787> they are USA company 17:17 < nmz787> other products are chinese lasers upgraded with their electronics 17:20 < danielcc> I'm just wanting to think simplicity... 
A fella from work showed me this: http://www.instructables.com/id/Pocket-laser-engraver/step2/Rip-apart-the-DVD-Roms/ 17:20 < nmz787> yeah blu-ray or IR is the only way to go really 17:20 < nmz787> excimer is what you want tho, and they're dam expensive 17:20 < nmz787> DRIE might do it 17:21 < danielcc> apparently the heads should be set to roughly what we need, but I was wanting to shoot kinda high with the borosilicate glass 17:21 < nmz787> http://en.wikipedia.org/wiki/Deep_reactive-ion_etching#Applications 17:21 < nmz787> g2g 17:21 < nmz787> would like to talk again later 17:22 -!- nmz787 [~Nathan@ool-45792f2b.dyn.optonline.net] has quit [Quit: Leaving.] 17:23 -!- chris_99 [~chris_99@unaffiliated/chris-99/x-3062929] has quit [Quit: Leaving] 17:25 < danielcc> I feel certain that this deep etching is not going to be achievable in most home labs... though I do have a deep vacuum system laying around if anyone in the SC region wants to help put together a setup 17:25 < danielcc> I have actually used this method for making some microcantilevers 17:31 -!- tashoutang [~tata@pc131090206.ntunhs.edu.tw] has joined ##hplusroadmap 17:32 < tashoutang> morning 17:34 <@kanzure> hi. 17:38 < tashoutang> hi,@K 17:38 <@kanzure> tashoutang: press k then press tab 17:38 < tashoutang> kanzure 17:38 < tashoutang> my msn is michaeltashoutang@hotmail.com 17:38 -!- soylentbomb [~k@d149-67-118-140.col.wideopenwest.com] has quit [Quit: Leaving] 17:38 <@kanzure> my client does not notify me when someone says "K", but it does notify me for "kanzure" 17:39 < tashoutang> oK 17:39 < tashoutang> my skype is tashoutang-essan 17:39 <@kanzure> my contact details are on here: http://heybryan.org/ 17:45 < tashoutang> OK 17:45 < tashoutang> :) 17:46 -!- nmz787 [~Nathan@ool-45792f2b.dyn.optonline.net] has joined ##hplusroadmap 17:57 <@kanzure> brownies: https://github.com/soundcloud/large-hadron-migrator 17:57 <@kanzure> "The basic idea is to perform the migration online while the system is live, without locking the table. In contrast to OAK and the facebook tool, we only use a copy table and triggers." 17:58 < brownies> kanzure: interesting! thanks 17:58 <@kanzure> so it looks like it just copies a giant million-row table to a temporary table and slowly migrates things over? 17:59 < brownies> although the fact that they assume the presence of an :id column is... stupid. 17:59 <@kanzure> how does this work in production when some of your users are migrated and others aren't? 17:59 <@kanzure> say your migration takes two hours.. 17:59 < brownies> not sure, tbh, what they're doing. 17:59 <@kanzure> i guess you would write your code to check whether or not the migration is completed?? 17:59 <@kanzure> e.g. in controller methods 18:00 < brownies> no, that would make it stupid and useless 18:00 < brownies> also, this already makes it mostly useless... "Lhm currently only works with MySQL databases" 18:00 <@kanzure> "TableMigrator does essentially the same thing, but more intelligently. First, we create a new table like the original one, and then apply one or more ALTER TABLE statements to the unused, empty table. Second we copy all rows from the original table into the new one. All this time, reads and writes are going to the original table, so the two tables are not consistent" 18:00 <@kanzure> "Finally, we acquire a write lock on the original table before copying over all new/changed rows, and swapping in the new table." 
18:00 <@kanzure> https://github.com/freels/table_migrator (twitter's similar thing)
18:01 <@kanzure> "The solution to find updated or new rows is to use a column like updated_at (if a row is mutable) or created_at (if it is immutable) to determine which rows have been modified since the copy started. These rows are copied over to the new table using REPLACE"
18:01 < brownies> yeah, this is all basically the obvious solution.
18:01 -!- danielcc [62471046@gateway/web/freenode/ip.98.71.16.70] has quit [Ping timeout: 245 seconds]
18:02 <@kanzure> "The default method (:multi_pass => true) copies over changed rows in a non-blocking manner multiple times until we can be reasonably sure that the final sweep will take very little time. The last sweep is done within the write lock, and then the tables are swapped. The length of time taken in the write lock is extremely short, hence the claim zero-downtime migration."
18:02 <@kanzure> yep ok
18:02 <@kanzure> i thought it would be something more interesting than that
18:03 < brownies> eh, what else can you do?
18:03 < brownies> i like that... classic untrained-engineer idiocy
18:03 <@kanzure> i don't know, something that would deserve the name 'large hadron' :)
18:03 < brownies> "the downtime is small, so we called it zero."
18:03 <@kanzure> haha
18:03 < brownies> morons.
18:03 <@kanzure> lim downtime_t with a subscript => 0
18:04 <@kanzure> hahah.
18:04 <@kanzure> "This method will not propagate normal DELETEs to the new table if they happen during/after the copy. In order to avoid this, use paranoid deletion, and update the column you are using to find changes appropriately."
18:04 <@kanzure> you should use paranoid deletion anyway, but that's amusing
18:04 < brownies> haha
18:05 < brownies> i'm implementing paranoid deletion later today, from scratch
18:05 < brownies> because (surprise!) the author of the gem is an idiot
18:05 <@kanzure> oh, why?
18:05 <@kanzure> oh hm
18:05 <@kanzure> which gem were you using?
18:05 < brownies> if you use paranoid deletion, you can't use :default_scope
18:05 <@kanzure> i've tried a few, but i think i used rails3_acts_as_paranoid most recently
18:05 < brownies> let me take a look
18:05 < brownies> i surveyed the choices a while back, and decided they were all idiots
18:05 <@kanzure> sounds fun
18:06 <@kanzure> you know, out of all the times i've implemented paranoid delete, i don't recall ever making something that would periodically go through and actually delete really really old stuff
18:07 -!- delinquentme [~asdfasdf@c-71-236-101-39.hsd1.pa.comcast.net] has joined ##hplusroadmap
18:07 < delinquentme> afternoon all!
18:07 < brownies> oh, hmmm
18:07 < brownies> kanzure: the README doesn't make mention of this :default_scope issue ...
18:07 < brownies> kanzure: which means, either it's fixed, or the author is an asshole.
18:07 <@kanzure> do a pull request :P
18:08 <@kanzure> with the README updated, not the solution (unless you have that too)
18:09 < brownies> bleh ... and he doesn't seem to manage his pull requests competently
18:09 <@kanzure> that's lame
18:09 <@kanzure> someone should make a script that looks at your gems' repositories on github and rates how likely they are to be idiots
18:09 <@kanzure> "Your rails app is made up of 43% idiocy, congratulations!"
18:10 <@kanzure> this is easy to determine: do they have lots of open issues, pull requests? do they pull their pull requests, or do they re-commit the changes?
18:10 <@kanzure> do they mix tabs and spaces?
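For what it's worth, the repo-rating idea sketched above is only a few lines against the public GitHub API. The weights and the 0-100 "idiocy" scale are invented, the second repo owner below is a placeholder, and the tabs-vs-spaces check would require fetching file contents, which this skips.

    # Toy version of the repo-idiocy scorer described above. Uses the public GitHub
    # API (unauthenticated, so rate-limited); scoring weights are completely made up.
    import requests

    API = "https://api.github.com"

    def idiocy_score(owner, repo):
        info = requests.get(f"{API}/repos/{owner}/{repo}").json()
        pulls = requests.get(
            f"{API}/repos/{owner}/{repo}/pulls",
            params={"state": "open", "per_page": 100},
        ).json()
        open_issues = info.get("open_issues_count", 0)  # GitHub counts open PRs here too
        stale_pulls = sum(1 for p in pulls if p["created_at"] < "2012-01-01T00:00:00Z")
        # Arbitrary weighting: every open issue is 1 point, every ignored old PR is 5.
        return min(100, open_issues + 5 * stale_pulls)

    if __name__ == "__main__":
        gems = [
            ("soundcloud", "large-hadron-migrator"),
            ("someuser", "rails3_acts_as_paranoid"),  # placeholder owner
        ]
        for owner, repo in gems:
            print(f"{repo}: {idiocy_score(owner, repo)}% idiocy")

Rolling the per-gem scores up into a single "your rails app is 43% idiocy" number would just mean averaging over everything in the Gemfile.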
18:15 -!- nmz787 [~Nathan@ool-45792f2b.dyn.optonline.net] has quit [Quit: Leaving.]
18:20 -!- augur [~augur@208.58.5.87] has quit [Remote host closed the connection]
18:21 < brownies> hehe
18:21 < brownies> yeah, that would be useful.
18:21 < brownies> the non-idiots could proudly badge their READMEs
18:21 < brownies> "i'm only 10% idiot, according to idiotrubyists.com!"
18:22 -!- yashgaroth [~f@cpe-66-27-117-179.san.res.rr.com] has joined ##hplusroadmap
18:29 < yashgaroth> kanzure: methods in molecular biology is good, definitely worth the space
18:30 -!- tashoutang [~tata@pc131090206.ntunhs.edu.tw] has quit [Ping timeout: 268 seconds]
18:37 <@kanzure> yashgaroth: ok i'll grab 'em all
18:45 -!- tashoutang [~tata@pc131090206.ntunhs.edu.tw] has joined ##hplusroadmap
18:56 -!- nmz787 [~Nathan@ool-45792f2b.dyn.optonline.net] has joined ##hplusroadmap
19:04 -!- ThomasEgi [~thomas@panda3d/ThomasEgi] has quit [Remote host closed the connection]
19:06 < nmz787> ahh this is gold: http://www.youtube.com/playlist?list=PLD0444BD542B4D7D9&feature=plcp
19:07 < nmz787> About Walter + Eliza Hall Institute Movies from The Walter and Eliza Hall Institute of Medical Research
19:07 <@kanzure> yashgaroth: some of these books seem to be highly esoteric though
19:08 < yashgaroth> oh, totally
19:08 < foucist> you rails guys should be in my ridiculously awesome #offrails channel
19:08 -!- minimoose [~minimoose@pool-173-75-216-239.phlapa.fios.verizon.net] has quit [Quit: minimoose]
19:08 <@kanzure> "siRNA Techniques for Gene Silencing in Transgenic Wombats - Methods in Molecular Biology - 8th edition"
19:08 < yashgaroth> but when you gotta know about cyclooxygenases, you better hope you have vol. 644 handy
19:10 <@kanzure> cyclowhats?
19:10 <@kanzure> blaargh chrome removed --vertical-tabs https://code.google.com/p/chromium/issues/detail?id=99332#c25
19:10 < yashgaroth> hey man you've got the book now
19:11 <@kanzure> not quite yet
19:11 < nmz787> COX right yashgaroth?
19:11 < yashgaroth> yep, and one day when I'm stuck in limbo I can read up on it
19:11 <@kanzure> didn't you just experience a three month limbo?
19:12 < nmz787> he prob got to vol 643
19:12 <@kanzure> what a slacker
19:12 < nmz787> thats why he is ready for 644
19:12 < yashgaroth> not quite limbo enough to read up on COX
19:12 < nmz787> gene jokes
19:12 < nmz787> oh gosh
19:12 < yashgaroth> remembering how shitty work is, is giving me more motivation than unemployment ever did
19:12 < nmz787> genetic slurs
19:12 < yashgaroth> truncated cox
19:13 < nmz787> "heard your COX wasn't doing too well, nerd!"
19:13 < yashgaroth> hey girl I heard you're an expert in cox assays
19:13 < nmz787> lol
19:13 < nmz787> that's a good one
19:15 < brownies> chrome used to have vertical tabs?
19:15 < brownies> wild
19:16 -!- jk4930 is now known as transgenicWombat
19:19 < nmz787> "Disclaimer - Due to Chrome limitations, extensions cannot restore history from closed tabs"
19:20 <@kanzure> dafuq
19:20 <@kanzure> brownies: my religion is carefully constructed around the existence of https://addons.mozilla.org/en-US/firefox/addon/tree-style-tab/
19:22 <@kanzure> although opera does that better
19:23 < brownies> heh yea i used to use that pretty heavily
19:23 < brownies> nowadays i just stash everything in pinboard
19:24 < nmz787> can you reduce the visibility of firefox to that of chrome?
19:24 < nmz787> kanzure: does it retain your tab history?
19:25 <@kanzure> firefox has other tab history things
19:26 < nmz787> so it doesnt do it
19:26 < nmz787> i mean clicking back
19:26 < nmz787> after restoring a tab
19:28 -!- _sol_ [~Sol@c-174-57-58-11.hsd1.pa.comcast.net] has quit [Read error: Connection reset by peer]
19:29 -!- _sol_ [~Sol@c-174-57-58-11.hsd1.pa.comcast.net] has joined ##hplusroadmap
19:32 < nmz787> so toomanytabs doesn't actually get rid of the tabs visually?
19:32 < nmz787> nor does tree-style-tab?
19:36 < nmz787> this looks like what i want https://chrome.google.com/webstore/detail/bbcnbpafconjjigibnhbfmmgdbbkcjfi
19:40 -!- transgenicWombat is now known as jk4930
19:42 < nmz787> oh, bad reviews
19:43 < nmz787> i could just have like 10 different chromium installs
19:46 < foucist> chrome is fucking shit.. i really need to stop using it as my main browser
19:46 < brownies> what alternative is there, though?
19:47 < foucist> i use Sessions Buddy extension in chrome
19:47 < foucist> brownies: firefox or safari
19:47 < brownies> eh, i try out safari every now and then, but it feels slower
19:47 < brownies> and Firefox still leaks memory everywhere
19:48 < foucist> have you tried ff this summer?
19:48 < foucist> apparently they were fixing that big time
19:48 < foucist> i haven't tried it yet
19:48 < foucist> i definitely need a more memory efficient browser than chrome
19:48 < foucist> or i need to know which version of chrome is the most memory efficient and fix it at that
19:51 -!- augur [~augur@208.58.5.87] has joined ##hplusroadmap
19:51 < brownies> i heard something about that, but i haven't tried FF in a while
19:51 < foucist> brownies: i have my swap disabled for speed reasons, but if i go over 4gigs ram it crashes my computer.. chrome is painful since it tends to use up all my memory
19:51 < foucist> of course, i tend to have loads of tabs open
19:53 < brownies> =/
19:53 < brownies> yeah, i got in the habit of closing tabs, and instead pinboarding things that i wanted to read later
19:54 < nmz787> how do i add a sub comment to stackoverflow?
19:54 < nmz787> http://stackoverflow.com/questions/9981267/google-chrome-back-and-forward-history?answertab=oldest#tab-top
19:55 < foucist> brownies: i don't actually read anything later even if i bookmark or use readitlater/etc bookmarklets heh
19:56 < nmz787> yeah that's why i want to save my tabs, not bookmark them
19:56 < nmz787> ;)
19:56 < nmz787> the difference is, uh, retaining tab history
19:56 -!- ivan`_ is now known as ivan`
19:56 < nmz787> and the concept of loading/unloading... if i click a bookmark, i dont want to have to click around to delete it too
19:58 < foucist> i don't like bookmarks at all, i want more of a seamless logging interface.. log all the sites and pages i went to, then when i "google" something, search my local cache first
19:58 < foucist> based off my history (not just links but actual content perhaps)
19:58 < nmz787> hmm, that would annoy me i.e. slow me finding new stuff i think
19:59 < nmz787> maybe not
19:59 < brownies> foucist: yeah, neither do i, but it's a nice lie to tell myself so i don't drown in tabs
19:59 < brownies> foucist: try out historious ...
19:59 < brownies> might be what you're looking for
20:00 < foucist> cool
20:11 -!- nmz787 [~Nathan@ool-45792f2b.dyn.optonline.net] has quit [Read error: Connection reset by peer]
20:12 -!- nmz787 [~Nathan@ool-45792f2b.dyn.optonline.net] has joined ##hplusroadmap
20:27 -!- hankx7787 [181e2e48@gateway/web/freenode/ip.24.30.46.72] has joined ##hplusroadmap
20:40 < brownies> kanzure: looks like DelayedJob might be another one that's full of random unpredictable fail =/
20:43 <@kanzure> i don't think i'm smart enough to understand noscript
20:43 <@kanzure> how do i make it inform me of things (like xss attacks it spots) without disabling javascript everywherE?
20:43 <@kanzure> *everywhere?
20:49 < delinquentme> i just dont understand http://www.gizmag.com/walking-paper-robot-elastic-bands/23877/
20:49 < delinquentme> dude (painstakingly) makes a robot entirely out of paper and rubberbands
20:50 < nmz787> kanzure said not to follow your links
20:50 < nmz787> but i will this time
20:53 < skorket> nmz787, hows the laser cutter coming?
20:54 <@kanzure> 'gizmag' sounds like spam to me
20:55 < nmz787> gonna order a diode and power supply in the next few days
20:55 < tashoutang> http://www.panoramas.dk/mars/greeley-haven.html
20:56 < tashoutang> the view on Mars
20:56 < nmz787> looking into modelling the whole system as well, using goptical or something like it/better
20:56 < skorket> you have the steppers, drivers and hardware?
20:56 < nmz787> maybe this http://code.google.com/p/pyoptools/
20:56 < tashoutang> they can put the blue green algae there....
20:58 <@kanzure> firebug feels a little forced in comparison to chrome inspector
21:00 < nmz787> skorket: no not yet, going to wait til i get the laser and optics sorted out, I still need to try and find some CAD models for fenn, if I can find them, or CAD drawings of a few of the cutter components... so his model can be complete
21:00 < skorket> why do you need to simulate the optics?
21:00 -!- klafka [~textual@c-67-174-253-229.hsd1.ca.comcast.net] has joined ##hplusroadmap
21:00 < nmz787> just want to
21:01 < nmz787> it will be good to learn, as I will prob need to do something similar when I try to do single-molecule fluorescence detection
21:01 < skorket> ah, I see, yeah, cool
21:01 < nmz787> i feel it's stuff like that which separates good engineering/design from just screwing around
21:02 < nmz787> I trust things with nice spec sheets more than claims of 'it works great'.... but yeah mainly because I also am totally new to optics
21:02 < skorket> What diode are you getting?
21:03 -!- nmz787 [~Nathan@ool-45792f2b.dyn.optonline.net] has quit [Read error: Connection reset by peer]
21:04 -!- nmz787 [~Nathan@ool-45792f2b.dyn.optonline.net] has joined ##hplusroadmap
21:04 < nmz787> like most diagrams show rays emanating from a focus, but what if my rays emanate from somewhere else... then I need the simulator
21:04 -!- nmz787 [~Nathan@ool-45792f2b.dyn.optonline.net] has quit [Read error: Connection reset by peer]
21:05 < skorket> No, I feel the same way about some tools. Even if my project is simple, I like getting my feet wet with the EDA packages that are available
21:22 <@kanzure> hmm http://code.google.com/p/pyoptools/
21:23 <@kanzure> so it's just a raytracer?
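Since the question is whether pyoptools is "just a raytracer": for the paraxial case nmz787 describes (rays that do not start at a focal point), a two-by-two ray-transfer matrix already gives the answer. A tiny numpy sketch follows; it does not use pyoptools or goptical, and the focal length, distances, and ray parameters are made-up numbers.

    # Paraxial (ABCD-matrix) trace of a single ray through one thin lens, to see
    # where a ray that does NOT start at the focal point lands. Numbers are arbitrary.
    import numpy as np

    def free_space(d):
        """Propagation over a distance d (consistent length units throughout)."""
        return np.array([[1.0, d], [0.0, 1.0]])

    def thin_lens(f):
        """Ideal thin lens with focal length f."""
        return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

    # A ray is (height y above the optical axis, angle u in radians).
    ray = np.array([2.0, 0.05])   # 2 mm off-axis, tilted upward, not at the focal point

    f = 50.0                             # mm, lens focal length
    s = 80.0                             # mm, source-to-lens distance
    s_img = 1.0 / (1.0 / f - 1.0 / s)    # conjugate image distance from 1/s + 1/s' = 1/f

    system = free_space(s_img) @ thin_lens(f) @ free_space(s)
    y_out, u_out = system @ ray
    print(f"at the image plane ({s_img:.1f} mm past the lens): "
          f"height {y_out:.2f} mm, angle {u_out:.4f} rad")

A full tracer such as pyoptools generalizes this to 3D surfaces, real glasses, and aberrations, which is roughly all "just a raytracer" means here.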
21:23 -!- jmil [~jmil@hive76/member/jmil] has quit [Quit: jmil]
21:28 -!- hankx7787 [181e2e48@gateway/web/freenode/ip.24.30.46.72] has quit [Quit: Page closed]
21:46 -!- yashgaroth [~f@cpe-66-27-117-179.san.res.rr.com] has quit [Quit: Leaving]
21:47 -!- jk4930_ [~jk@p57B73C7A.dip.t-dialin.net] has joined ##hplusroadmap
21:47 -!- jmil [~jmil@hive76/member/jmil] has joined ##hplusroadmap
21:48 -!- jmil [~jmil@hive76/member/jmil] has quit [Client Quit]
21:51 -!- jk4930 [~jk@p57B73587.dip.t-dialin.net] has quit [Ping timeout: 264 seconds]
21:51 -!- delinquentme [~asdfasdf@c-71-236-101-39.hsd1.pa.comcast.net] has quit [Quit: Leaving]
21:55 <@kanzure> matplotlib guy died http://mail.scipy.org/pipermail/ipython-dev/2012-August/010135.html
22:14 -!- skorket [~skorket@cpe-24-58-232-122.twcny.res.rr.com] has quit [Ping timeout: 244 seconds]
22:15 -!- skorket [~skorket@cpe-24-58-232-122.twcny.res.rr.com] has joined ##hplusroadmap
22:17 -!- drazak__ [~ahdfadkfa@199.188.72.84] has joined ##hplusroadmap
22:17 -!- drazak__ [~ahdfadkfa@199.188.72.84] has quit [Client Quit]
22:24 <@kanzure> oh geeze "那些自称“生物黑客”,并把该领域称之为“DIY生物学(DIYBio)”的狂热人士,开始进行他们自己的“合成生物学”试验项目" [roughly: "those enthusiasts who call themselves 'biohackers' and refer to the field as 'DIY biology (DIYBio)' have begun running their own 'synthetic biology' experiments"]
22:25 <@kanzure> hahah genspace translates to "绅士空间" apparently? "gentleman space"
22:25 <@kanzure> "绅士" is gentleman i think
22:28 <@kanzure> "亿万富豪企业家文特尔" ["billionaire entrepreneur Venter"] well that's one way to describe craig venter.. but i think this is wrong
22:30 < tashoutang> kanzure where did you see that?
22:31 < tashoutang> so they are very well informed!!!
22:35 <@kanzure> venter is not a billionaire
22:35 <@kanzure> to my knowledge.
22:35 <@kanzure> i think his net worth is south of $500M
23:02 <@kanzure> 2 hour python/requests scraping presentation from pycon 2012 http://youtube.com/watch?v=52wxGESwQSA
23:02 <@kanzure> but he's really super slow
23:04 <@kanzure> "Interestingly: I'm actually a guy on the team that sends back that different HTML from Google. ;) See: 55:19"
23:05 <@kanzure> oh nevermind, that's just for user agents
23:12 <@kanzure> http://www.learningjquery.com/2009/04/better-stronger-safer-jquerify-bookmarklet/
23:17 <@kanzure> so that video in the last 40 minutes shows using twisted/gevent/promises to process scraped content. i don't see why he didn't just stick with celery?
23:18 <@kanzure> oh neat, python-requests supports async http requests
--- Log closed Thu Aug 30 00:00:11 2012