--- Log opened Fri Dec 09 00:00:40 2022 03:24 -!- Malvolio [~Malvolio@idlerpg/player/Malvolio] has joined #hplusroadmap 03:56 -!- Malvolio [~Malvolio@idlerpg/player/Malvolio] has quit [Quit: recompile] 04:02 -!- Malvolio [~Malvolio@idlerpg/player/Malvolio] has joined #hplusroadmap 04:31 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has joined #hplusroadmap 04:57 -!- Molly_Lucy [~Molly_Luc@user/Molly-Lucy/x-8688804] has joined #hplusroadmap 05:07 -!- yashgaroth [~ffffffff@2601:5c4:c780:6aa0::a324] has joined #hplusroadmap 06:40 < kanzure> https://scholar.archive.org/ 06:49 < kanzure> overview of some recent work in machine learning for protein design (in particular alphafold2 and rosettafold) https://www.cell.com/trends/biochemical-sciences/fulltext/S0968-0004(22)00308-5 08:07 < kanzure> https://blog.nixbuild.net/posts/2022-03-16-lightning-fast-ci-with-nixbuild-net.html 08:07 < kanzure> https://docs.nixbuild.net/remote-builds/index.html 08:21 < kanzure> https://stripe.com/blog/fast-secure-builds-choose-two 08:40 < kanzure> "Automated reverse engineering of nonlinear dynamical systems" https://www.creativemachineslab.com/eureqa.html (i thought this page had died a while ago) 08:40 -!- srk [~sorki@user/srk] has joined #hplusroadmap 08:48 < kanzure> kinda curious to plug in something like GPT-3 into ben goertzel's opencog software-- it had a few different components, like one for symbolic reasoning, another for reinforcement learning, sensory processing, actuator control, executive control etc 08:49 < kanzure> maaku: to your knowledge has someone tried that? 
08:51 < kanzure> gpt-3 would benefit from pre-1990s automated reasoning technology like symbolic reasoning/math; e.g. it should outsource any kind of calculation to some other system because Next Token Prediction does not seem to be good at executing on math 09:00 -!- srk [~sorki@user/srk] has quit [Quit: ZNC 1.8.1 - https://znc.in] 09:00 -!- srk [~sorki@user/srk] has joined #hplusroadmap 09:22 < kanzure> another regression tool https://fitsh.net/wiki/man/lfit 09:37 < kanzure> disease-focused non-profits should probably as a rule allocate a high percentage of their budget upfront for thinking about how to speed up their results overall; i'm pretty sure this is underallocated at the moment in favor of slow basic research. 09:51 -!- lkcl- is now known as lkcl 10:32 -!- test__ is now known as _flood 12:00 -!- o-90 [~o-90@gateway/tor-sasl/o-90] has joined #hplusroadmap 12:07 < kanzure> hello o-90 12:13 < fenn> kinda long article about behavioral diversity https://www.theatlantic.com/magazine/archive/2009/12/the-science-of-success/307761/ 12:14 < fenn> "orchids" vs "dandelions" 12:15 < fenn> "it should outsource any kind of calculation" is easy enough to do by telling it to write a python program to do that 12:16 -!- o-90 [~o-90@gateway/tor-sasl/o-90] has quit [Ping timeout: 255 seconds] 12:16 < fenn> langchain is writing tools to automate this sort of thing with different models https://pypi.org/project/langchain/ 12:25 < fenn> eureqa-style symbolic regression generates compact representations of physical laws (models of reality) but, just like how smooth curves can be expanded into taylor polynomial series, they can be approximated by sums of even simpler functions like ReLU that are easy to implement in hardware at huge scales 12:26 < fenn> on the other hand, you could imagine an AI accelerator chip containing large numbers of polynomial evaluator cores 12:27 < fenn> isn't that a GPU?
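[editor's note: fenn's claim above — that a smooth curve can be approximated by a sum of simple ReLU units, much like a truncated Taylor series — can be checked with a short sketch. All names and numbers here are illustrative, not from the log; this builds the piecewise-linear interpolant of sin(x) expressed purely as a constant plus a sum of ReLUs.]

```python
import math

def relu(x):
    return max(0.0, x)

def relu_sum_approx(f, x0, x1, n):
    """Approximate f on [x0, x1] as f(x0) + sum_i a_i * relu(x - k_i).

    This is exactly the piecewise-linear interpolant of f through n+1
    evenly spaced knots: the coefficient at each interior knot is the
    change in slope there, so the sum of very simple ReLU primitives
    reproduces the smooth curve to within the interpolation error.
    """
    knots = [x0 + (x1 - x0) * i / n for i in range(n + 1)]
    slopes = [(f(knots[i + 1]) - f(knots[i])) / (knots[i + 1] - knots[i])
              for i in range(n)]
    # first coefficient is the initial slope; later ones are slope changes
    coeffs = [slopes[0]] + [slopes[i] - slopes[i - 1] for i in range(1, n)]

    def g(x):
        return f(knots[0]) + sum(a * relu(x - k) for a, k in zip(coeffs, knots))
    return g

g = relu_sum_approx(math.sin, 0.0, math.pi, 64)
err = max(abs(g(i * math.pi / 1000) - math.sin(i * math.pi / 1000))
          for i in range(1001))
```

With 64 segments the worst-case error is bounded by h^2 * max|f''| / 8, about 3e-4 here, which is fenn's point: enough cheap identical units can stand in for a library of exotic math primitives.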
12:28 < fenn> at some point the eureqa style "compact representation with a wide variety of math primitives" runs out of space on the die to evaluate each of those primitives directly, and you have to fall back to a general purpose computer 12:40 < fenn> d'oh 12:40 < fenn> file:///C:/Users/Hod/Desktop/CML_OldWebsite/creativemachines.cornell.edu/sites/default/files/PNAS10_Amend.pdf 13:03 < kanzure> https://diyhpl.us/~bryan/papers2/bio/Artificially%20selecting%20for%20intelligence%20in%20dogs%20to%20produce%20human-level%20IQ%20within%20100%20generations%20-%202022.pdf 13:03 < kanzure> from https://twitter.com/dumbreepicheep/status/1601203082372603909 13:03 < kanzure> should be achievable in less than 600 years really 13:15 < kanzure> like if you allow for 100x generation sizes 13:20 < docl> I wonder if ai will tend to encourage more special purpose hardware. like to play the latest game with decent speed you need the right asic set 13:38 -!- codaraxis___ [~codaraxis@user/codaraxis] has quit [Ping timeout: 260 seconds] 13:41 < nmz787> docl: all the game consoles basically already do/did that 13:46 < kanzure> 13:41 < gwern> kanzure: the problem with the dog guy is not that 600 years is too long, it's that his math is completely wrong to begin with. 
he apparently has never heard of truncation selection or even heritability 13:46 < kanzure> 13:43 < gwern> kanzure: smart enough to write pages of matlab and an academic-looking paper, not smart enough to actually bother to read literally anything on quantitative genetics in animal breeding, apparently 13:51 < docl> nmz787: I wonder if it will intensify / what are some viable directions for it to do so 13:53 < docl> like, how would you manage a library of 1000 asics for all your games 14:58 < nmz787> maybe it'll be like how Sonic and Knuckles was a pass-through/interposer cartridge 14:58 -!- codaraxis [~codaraxis@user/codaraxis] has joined #hplusroadmap 14:59 < nmz787> actually I seem to recall a few games in that era shipping with embedded ASICs that the console basically didn't care/need to know about 15:16 < docl> interesting! I didn't realize that was a thing in the old cartridges. makes sense though 15:44 < kanzure> i am looking for a strong review article about the problems of "open-ended evolution" in artificial life simulators (and our supreme lack of progress in that category) 15:44 < kanzure> and/or something that goes into different kinds of measurements that they have been testing to see if they have achieved that 16:02 < kanzure> .wik movile cave 16:02 < saxo> Article not found: https://en.wikipedia.org/wiki/Movile_cave gave 404 | Searched en for 'movile cave' | https://en.wikipedia.org/wiki/No_result_found gave 404 | Searched en for 'No result found' 16:02 < kanzure> .wik Movile Cave 16:02 < saxo> "Movile Cave (Romanian: Peștera Movile) is a cave near Mangalia, Constanța County, Romania discovered in 1986 by Cristian Lascu a few kilometers from the Black Sea coast." - https://en.wikipedia.org/wiki/Movile_Cave 17:11 < fenn> there's a bit of a goodhart's law problem.
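[editor's note: for the quantitative-genetics terms gwern invokes above — under truncation selection the per-generation response follows the breeder's equation R = h^2 * S, with selection differential S = i * sd where i is the mean of the selected normal tail. A toy sketch; the heritability, sd, and selected fraction below are hypothetical illustrations, not figures from the criticized paper.]

```python
import math

def truncation_selection_differential(sd, selected_fraction):
    """Selection differential S = i * sd for truncation selection.

    The selection intensity i is the mean of the selected upper tail
    of a standard normal: i = pdf(x) / p, where x is the truncation
    threshold with upper-tail probability p. The threshold is found
    by bisection on the complementary normal CDF (stdlib only).
    """
    p = selected_fraction
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2
        tail = 0.5 * math.erfc(mid / math.sqrt(2))  # P(Z > mid)
        if tail > p:
            lo = mid   # threshold too low, tail too fat
        else:
            hi = mid
    x = (lo + hi) / 2
    pdf = math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
    return (pdf / p) * sd

# hypothetical numbers: IQ-like trait with sd 15, narrow-sense
# heritability h^2 = 0.5, breeding from the top 10% each generation
S = truncation_selection_differential(15.0, 0.10)
R = 0.5 * S   # breeder's equation: response per generation R = h^2 * S
```

With these toy numbers the response is roughly 13 points per generation, which is the kind of back-of-envelope the standard animal-breeding literature would start from.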
if you could measure evolutionary diversity or whatever, then you could select for ecosystems that produce evolutionary diversity, and your meta ecosystem evolution would find something that technically satisfies the criteria but is not what you wanted at all 17:13 -!- Malvolio is now known as Mabel 17:13 < fenn> in my opinion a-life (or e-life) has been uninteresting because it completely lacks any kind of developmental biology processes or embodied information (e.g. RNA) 17:14 < muurkha> this reminds me of how many atheists lost their religion on learning about parasitoid wasps 17:14 < muurkha> go forth and multiply 17:14 < muurkha> oh fuck no not like that 17:14 < muurkha> what the fuck have i done 17:15 < fenn> being god is a lot of responsibility 17:15 < muurkha> well, if that's what it takes, I'm willing to bear it 17:48 < kanzure> fenn: is the issue lack of developmental biology, or is the issue that the environment is too two-dimensional (compute and memory) which has little in common with the natural abundance of types of resources in the real world 17:49 < kanzure> it would be like giving biology two dimensions-- "energy" and "physical occupancy space in three-dimensions" (and i guess opcodes would be another "resource") 17:50 < fenn> what little physics they do implement is always external to the organism. "jump the farthest" but no elucidation of how muscles work 17:50 < fenn> you just get magical muscles out of nowhere for free 17:51 < kanzure> biology or e-life? 17:51 < fenn> e-life 17:51 < fenn> meanwhile, reality is massively, horribly complex, to an absurd degree 17:52 < fenn> even those illustrations of cells with all the proteins jam packed in there are still too simplified 17:53 < kanzure> nahhh what's the upper bound on protein state space in a cell, 10^500? can't be that bad! 17:59 < muurkha> why would you think it was so small? 
18:00 < muurkha> .units 100 kg / 3.7e13 in daltons 18:00 < saxo> 100 kg / 3.7e13 = 1.6276056e+15 daltons 100 kg / 3.7e13 = (1 / 6.1439944e-16) daltons 18:00 < muurkha> so each cell is about 1e15 daltons 18:01 < muurkha> of which maybe 80% is water, so say 3e14 daltons of proteins and lipids and stuff 18:02 < muurkha> each amino acid is about 150 daltons, so that's about 2e12 amino acids per cell, if we disregard the lipids and nucleic acids and ATP and things 18:03 < muurkha> if we just pay attention to the sequence of amino acids, rather than folding and where the various proteins are, that's roughly 22**(2e12) 18:03 < muurkha> possible states 18:03 < muurkha> .units log(22) * 2e12 18:03 < saxo> Definition: 2.6848454e+12 18:04 < muurkha> so that's about 10**2.7e12 possible states 18:04 < muurkha> which is roughly 10**2.7e12 times larger than your 10**500 18:05 < muurkha> the number presumably gets much bigger if you try to take into account things like where the proteins are and what else they're stuck to at the moment 18:07 < muurkha> that's an average human cell though 18:10 < fenn> should say "which is roughly 10**2.7e10 times larger than your 10**500" 18:10 < fenn> but who's counting 18:10 < muurkha> nope, 10**2.7e12 18:11 < muurkha> suppose the actual number is 10**2 684 845 412 845 18:11 < muurkha> then it's 10**2 684 845 412 345 times bigger than 10**500 18:11 < muurkha> subtraction, not division, because this is the exponent 18:12 < fenn> oh, right. sorry 18:14 < muurkha> am I overlooking anything important in the calculation above? 18:30 < fenn> phonons and other kinds of conformational state 18:31 < fenn> i'm not sure what the point of the exercise is 18:32 < fenn> let's say each amino acid has 2 rotational degrees of freedom and a dozen vibrational modes. does it change the outcome? 
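[editor's note: muurkha's arithmetic above can be reproduced directly. The inputs — a 100 kg body, 3.7e13 cells, 80% water, ~150 Da per amino acid, and a 22-letter alphabet — are the log's own rough assumptions.]

```python
import math

CELLS = 3.7e13                  # rough human cell count (log's assumption)
BODY_KG = 100.0
DALTON_KG = 1.66053906660e-27   # one dalton in kilograms

cell_daltons = BODY_KG / CELLS / DALTON_KG   # ~1.6e15 Da per average cell
dry_daltons = cell_daltons * 0.2             # ~80% of the mass is water
n_residues = dry_daltons / 150.0             # ~150 Da per amino acid -> ~2e12

# number of amino-acid sequences is 22**n_residues; too big to hold as an
# int, so work with the base-10 exponent instead
log10_states = n_residues * math.log10(22)

# muurkha's subtraction point: dividing 10**a by 10**500 subtracts the
# exponents, so the ratio to kanzure's 10**500 is still ~10**(a - 500)
ratio_exponent = log10_states - 500
```

This reproduces the log's figures: about 1.6e15 Da per cell, on the order of 2e12 residues, and roughly 10**(2.7e12) sequence states, so the 10**500 guess is short by an exponent of trillions, not a factor of trillions.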
not much 18:36 < kanzure> in context i guess i should have commented about sequence space not state space (as the sequence space is the pertinent retainable information) 18:36 < kanzure> lots of cross-interactions between proteins and amino acids are probably statistically irrelevant 18:38 < muurkha> well, you could probably find a reduced-dimensionality representation that captures the interesting parts 18:38 < muurkha> but you can't do that a priori 19:04 -!- codaraxis [~codaraxis@user/codaraxis] has quit [Ping timeout: 260 seconds] 19:10 -!- yashgaroth [~ffffffff@2601:5c4:c780:6aa0::a324] has quit [Quit: Leaving] 19:18 < fenn> game engines are basically photorealistic now out of the box: https://www.youtube.com/watch?v=SrkuJlE17w4 (unreal engine 5.1) 19:18 < Muaddib> [SrkuJlE17w4] Anna in Cyborg Suit (5:05) 19:20 < fenn> no, not that 19:20 < fenn> https://www.youtube.com/watch?v=SrkuJlE17w4&t=3m15s 19:20 < Muaddib> [SrkuJlE17w4] Anna in Cyborg Suit (5:05) 19:20 < fenn> wtf 19:20 < fenn> https://www.youtube.com/watch?v=FUGqzE6Je5c&t=3m15s 19:20 < Muaddib> [FUGqzE6Je5c] Why Unreal Engine 5.1 is a Huge Deal (9:54) 19:21 < fenn> and in case you were wondering, no that's not my fetish 19:21 * fenn grumbles 19:21 < muurkha> haha 19:31 < muurkha> is this about Nanite? 19:34 < muurkha> apparently yes, at the 15 second mark 19:47 < fenn> the realtime bounce lighting is also very impressive 19:49 < fenn> as for "nanite", well, frankly i don't understand why dynamic level of detail is a new thing in 2022 19:50 -!- codaraxis [~codaraxis@user/codaraxis] has joined #hplusroadmap 19:50 < fenn> "pop in" should never have been a thing even with billboards.
just fade between the different LODs smoothly 19:53 < muurkha> nanite is per-polygon dynamic level of detail without cracks or certain other artifacts that have plagued previous attempts 19:53 < muurkha> but it's not new in 02022 19:54 < jrayhawk> i was always surprised distant two-dimensional LODs weren't just re-rendered to a temporary framebuffer in some occasional amortized manner in the spare time between vsync and physics thread return 19:55 < jrayhawk> doing them statically always seemed real dumb 19:55 < muurkha> well, I guess it didn't ship to actual users until 02022. but Epic were doing Nanite demos in 02020 19:56 < fenn> yeah generated billboards would always match their context 19:57 < fenn> muurkha: i mean dynamic LOD should have appeared in 200X 19:57 < muurkha> it turned out to be harder than you think it is ;) 19:58 < muurkha> there were some dynamic LOD things going on in 200X, Josh Levenberg published a paper in 02002 or so 19:58 < muurkha> this sure does look like a fetish video 19:59 < muurkha> minute after minute of a young woman slowly putting on a rubber suit on top of another rubber suit 19:59 < muurkha> and then guess what? 
like half of all youtube porn videos, then she inflates it 19:59 < jrayhawk> it's somebody *else's* fetish 20:01 < muurkha> that definitely seems to be the motivation for filming this video, yes 20:15 < fenn> somehow josh levenberg's page hasn't changed at all since 2002 https://www.technomagi.com/josh/ 20:16 < fenn> 2003* 20:18 -!- redlegion is now known as redgreenlegion 20:24 -!- codaraxis__ [~codaraxis@user/codaraxis] has joined #hplusroadmap 20:27 -!- codaraxis [~codaraxis@user/codaraxis] has quit [Read error: Connection reset by peer] 20:54 < fenn> ok this is my fetish https://www.youtube.com/watch?v=ye6YpxFE9jk&t=5m 20:54 < Muaddib> [ye6YpxFE9jk] Backrooms - Reunion (13:12) 21:44 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has quit [Ping timeout: 260 seconds] 22:59 -!- WizJin [~Wizzy@user/WizJin] has quit [Quit: Leaving] 23:09 < maaku> Muaddib: should lsneff join the Starlink project or Starship project? 23:09 < Muaddib> maaku: I'd say he should hitch a ride on the Starship and take his own Starlink. 23:11 < maaku> kanzure: opencog folks seem to be doubling down on symbolic approaches rather than using GPT: https://groups.google.com/g/opencog/c/Yh5djrX9uhs/m/k9WRru0RAgAJ 23:11 < maaku> I haven't really paid attention to OpenCog these last five years or so though 23:15 < lsneff> if the details work out, i think i’ll probably go to starbase 23:19 < maaku> nice! 23:21 < maaku> kanzure: I remain convinced that OpenCog is a better approach to AI engineering as a standardized discipline. 23:22 < maaku> Develop highly specialized, highly capable subsystems (like GPT) and plug them into a common knowledgebase. 23:23 < maaku> In the literature this is called the blackboard architecture, I believe. 23:28 < maaku> But sadly I think the days of doing actual AI engineering are gone.
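[editor's note: maaku's blackboard architecture — specialized subsystems posting to and reading from a common knowledgebase under a control loop — can be sketched minimally. All class and method names here are hypothetical illustrations, not OpenCog's actual API.]

```python
class Blackboard:
    """Shared knowledgebase that registered knowledge sources read and write."""

    def __init__(self):
        self.facts = {}
        self.sources = []   # specialized subsystems (an LLM, a symbolic solver, ...)

    def register(self, source):
        self.sources.append(source)

    def post(self, key, value):
        self.facts[key] = value

    def run(self, max_cycles=10):
        # classic blackboard control loop: poll each source until none
        # can contribute anything new, or the cycle budget runs out
        for _ in range(max_cycles):
            progressed = False
            for source in self.sources:
                progressed |= source.contribute(self)
            if not progressed:
                break
        return self.facts

class ArithmeticSolver:
    """Stand-in for a specialized subsystem: the 'outsource any kind of
    calculation to some other system' idea from earlier in the log."""

    def contribute(self, bb):
        if "expr" in bb.facts and "answer" not in bb.facts:
            # toy evaluator with builtins stripped; a real system would
            # dispatch to a proper CAS or sandboxed interpreter
            bb.post("answer", eval(bb.facts["expr"], {"__builtins__": {}}))
            return True
        return False

bb = Blackboard()
bb.register(ArithmeticSolver())
bb.post("expr", "2**10 + 7")
facts = bb.run()
```

The point of the pattern is that each subsystem only needs to agree on the shared fact store, not on each other's internals, which is what maaku means by AI engineering as a standardized discipline.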
Now it is faster to just try a model 100x the size, and/or search the hyperparameter space for a better network configuration 23:49 < fenn> "search the hyperparameter space" is just another way of saying "minimum viable self-improving AGI" --- Log closed Sat Dec 10 00:00:41 2022