--- Log opened Wed Feb 22 00:00:51 2023
01:37 -!- Croran [~Croran@c-73-118-187-65.hsd1.wa.comcast.net] has quit [Ping timeout: 252 seconds]
01:44 -!- Croran [~Croran@c-73-118-187-65.hsd1.wa.comcast.net] has joined #hplusroadmap
04:44 -!- stipa_ [~stipa@user/stipa] has joined #hplusroadmap
04:46 -!- stipa [~stipa@user/stipa] has quit [Ping timeout: 260 seconds]
04:46 -!- stipa_ is now known as stipa
05:01 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has quit [Read error: Connection reset by peer]
05:05 < kanzure> but really, what are these so-called good alternatives that make adult gene therapy work really really well? even speculatively what are they?
05:06 < kanzure> micromachines that crawl around in your body and find all the weird cell niches? nanobots? super viruses that we haven't invented yet?
05:06 < kanzure> or is the thinking that the class of problems that adult humans usually want to solve with gene therapy happen to also be the set of things that current gene therapy is able to solve?
05:06 < kanzure> wonder if there's a good "against gene therapy" article that outlines these problems- maybe something to write if it doesn't already exist
05:08 < kanzure> also, what about genetic changes where you need most of the cells in the body to change their expression profile ~simultaneously? inducible expression might be able to do that.. but maybe not.
05:08 < kanzure> the wikipedia article doesn't have a particularly strong section on limitations https://en.wikipedia.org/wiki/Gene_therapy#Adverse_effects,_contraindications_and_hurdles_for_use immunogenicity is listed, at least.
05:19 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has joined #hplusroadmap
05:42 < kanzure> here's what chatgpt says: https://diyhpl.us/~bryan/irc/chatgpt/adult-gene-therapy.txt
05:55 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has quit [Ping timeout: 255 seconds]
05:57 -!- yashgaroth [~ffffffff@2601:5c4:c780:6aa0:4d45:8f41:ebb3:6d1b] has joined #hplusroadmap
06:36 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has joined #hplusroadmap
07:10 -!- stipa_ [~stipa@user/stipa] has joined #hplusroadmap
07:10 -!- stipa [~stipa@user/stipa] has quit [Read error: Connection reset by peer]
07:10 -!- stipa_ is now known as stipa
07:42 < kanzure> https://diyhpl.us/~bryan/papers2/nanotech/Mechanical%20computing%20systems%20using%20only%20links%20and%20rotary%20joints%20-%202018.pdf
07:42 < kanzure> let's do that with proteins
07:43 -!- stipa_ [~stipa@user/stipa] has joined #hplusroadmap
07:45 -!- stipa [~stipa@user/stipa] has quit [Ping timeout: 255 seconds]
07:45 -!- stipa_ is now known as stipa
08:10 < kanzure> hmm "In about face, Hong Kong revokes visa for China's CRISPR baby scientist"
08:41 -!- ccdle12 [~ccdle12@243.222.90.149.rev.vodafone.pt] has joined #hplusroadmap
09:03 < kanzure> "De novo design of luciferases using deep learning" https://www.nature.com/articles/s41586-023-05696-3
09:32 -!- ccdle12 [~ccdle12@243.222.90.149.rev.vodafone.pt] has quit [Quit: Client closed]
10:07 -!- cthlolo [~lorogue@77.33.23.154] has joined #hplusroadmap
10:32 -!- CryptoDavid [uid14990@id-14990.uxbridge.irccloud.com] has joined #hplusroadmap
12:11 -!- codaraxis [~codaraxis@user/codaraxis] has joined #hplusroadmap
12:31 -!- cthlolo [~lorogue@77.33.23.154] has quit [Read error: Connection reset by peer]
12:36 < hprmbridge> nmz787> I wonder if chatgpt can give recommendations on getting onto darkweb sites
12:37 < hprmbridge> nmz787> Or obtain paywalled articles
13:08 -!- codaraxis [~codaraxis@user/codaraxis] has quit [Quit: Leaving]
13:15 -!- CryptoDavid [uid14990@id-14990.uxbridge.irccloud.com] has quit [Quit: Connection closed for inactivity]
16:10 -!- balrog [balrog@user/balrog] has quit [Quit: Bye]
16:28 -!- stipa_ [~stipa@user/stipa] has joined #hplusroadmap
16:29 -!- stipa [~stipa@user/stipa] has quit [Ping timeout: 252 seconds]
16:29 -!- stipa_ is now known as stipa
18:15 < kanzure> http://www.incompleteideas.net/IncIdeas/BitterLesson.html
19:18 < fenn> the bitter lesson is that AI research is super boring
19:18 < fenn> as soon as you start to do anything interesting, it's not AI research anymore
19:19 < kanzure> is that a comment about returns on research not being incremental/engaging along the way?
19:37 < fenn> you're basically flailing around randomly in the space of permutations of mathematical lego blocks, doing trial and error, with lots of waiting for results to come back after training
19:38 < fenn> i could be wrong, and some super talented AI researcher has an intuition for what's better, but if that's true i haven't heard anything about it
19:40 < kanzure> so you'd say it's all pretty close to bruteforce?
19:41 < kanzure> but, there are many kinds of things you could try to do in an attempt to bruteforce, like adding numbers together infinitely or something
19:42 < kanzure> we need a resident machinist for the channel
19:44 < fenn> i know a little about machining
19:44 < fenn> by which i mean i spent several years of my life on it
19:45 < kanzure> the kind of atoms or the kind of bits?
19:45 < kanzure> machine-learnerist
19:45 < fenn> the kind where you shove a sharp pointy thing into a block of material and it scrapes off a piece of the material
19:46 < kanzure> i was looking around unsuccessfully for a study where someone just goes into a giant weighted ANN or language model and deletes weights using their intuition or human learning process
19:46 < kanzure> to see if there can be identified clumps of patterns based on different traces for different input/output results
19:47 < fenn> well you can typically reduce the size of a model until it stops working, but you have to re-train it each time you change the size
19:47 < fenn> probably there are pruning algorithms
19:48 < kanzure> visualgpt6502.org
19:48 < fenn> Yann LeCun’s 1990 paper Optimal Brain Damage http://yann.lecun.com/exdb/publis/pdf/lecun-90b.pdf
19:49 < fenn> how you gonna visualize 175 billion parameters?
19:50 < kanzure> with one of those gigapixel viewers
19:54 < fenn> "sort the parameters by saliency and delete some low-saliency parameters, rinse and repeat"
19:55 < fenn> i feel like defragging a neural network is going to be important, so you're not moving data all over the place
19:56 < fenn> neurons that are typically active at the same time should be closer together
19:57 < kanzure> "this neuron activates on the mathematical abstract concept of zero but also your old smelly socks"
19:57 < kanzure> (it's probably groups of neurons of course, not everything is a mirror face neuron)
19:59 < fenn> i'd like to live in a world where pieces of neural networks are interchangeable and you don't have to spend a billion dollars retraining everything from the ground up every time you want to add a new domain
19:59 < fenn> defragmenting probably has something to do with that, in my uneducated intuition
20:00 < kanzure> instead of in vitro neural tissue culture what about a neural network made up of human centipede
20:01 < fenn> it wouldn't work
20:01 < kanzure> drosophila fly matrix where each one acts as a giant neuron- probably looking at a pixel screen and the 'axon' is its movement, and the data gets transmitted to another drosophila's screen
20:02 < kanzure> meta-evolution/AI techniques can be applied to the circuit wiring of these 'neurons'
20:03 < fenn> how do you reward them?
20:03 < kanzure> yeah i was wondering about that, even for human brain biological neurons how are those rewarded
20:03 < fenn> "what fires together wires together"
20:04 < kanzure> feeding them is also somewhat difficult, apparently drosophila are fed by damp cotton ball good luck designing a mechanical system to scale tha tup
20:04 < kanzure> *that up
20:04 < fenn> cotton ball on a chip
20:05 < fenn> a random forest of carbon nanotubes soaked in sugar water
20:05 < fenn> this scheme sounds very slow
20:05 < fenn> when you said "human centipede" i thought you meant frankensteining a bunch of vivisected brains together
20:05 < kanzure> i was hoping there would be literature about "networks of networks" to see if there is any math about potential benefit of having neurons implemented by (drosophila's) 100,000 neurons
20:06 < kanzure> sorry for the misunderstanding, it's because i never actually saw the movie or looked at what the concept was
20:06 < fenn> don't bother, it's dumb
20:06 < fenn> but yes, the output of one node becomes the input of the next node
20:07 < kanzure> this is why i was pestering nsh the other day about twitchplayspokemon and whether there have been any multi-human searle room experiments
20:07 < fenn> pff, DO AN EXPERIMENT? what do you think this is, science?!
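[editor's note: the "sort the parameters by saliency and delete some low-saliency parameters, rinse and repeat" loop fenn quotes above can be sketched in a few lines. This is a minimal illustration using weight magnitude as a stand-in saliency measure (Optimal Brain Damage proper uses a second-derivative estimate, and real pipelines re-train between pruning rounds); the function name and constants are invented for illustration.]

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_by_saliency(weights, fraction):
    """Zero out the lowest-saliency fraction of weights.

    Saliency is approximated here by |weight|; OBD instead estimates
    the loss increase from deleting each parameter via the Hessian.
    """
    flat = np.abs(weights).ravel()
    k = int(flat.size * fraction)
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

w = rng.normal(size=(8, 8))
w_pruned = prune_by_saliency(w, 0.5)
print(np.count_nonzero(w_pruned))  # 32 of 64 weights survive
```

The "rinse and repeat" part is just calling this in a loop with re-training in between, exactly the cycle fenn complains about having to redo each time the model shrinks.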
20:08 < fenn> 100,000 to 1 is a lot of dimensionality reduction per layer
20:08 < kanzure> with humans i think part of the issue is that you need the sum of the parts to produce an outcome that is more interesting than a single person, otherwise you would just use a single person
20:08 < fenn> isn't this just "the wisdom of the crowds"
20:09 < kanzure> so for a human compute matrix you would probably want to let the human use words or other structured media for outputs or something
20:09 < kanzure> apparently no
20:09 < kanzure> wisdom of the crowds is like, everyone types an answer and they upvote or something
20:10 < fenn> how is it different from a typical hierarchical bureaucracy with information flowing from bottom to top
20:10 < kanzure> it may be possible to achieve surprisingly strong outcomes by reducing the exercise to neuronal computation instead of normal human social structures/activities
20:12 < kanzure> i wouldn't want to join the borg if it was just a cacophony of voices shouting at each other
20:13 < fenn> but you get to live forever and travel at warp 9
20:14 < kanzure> twitchplayspokemon is an interesting platform for some research- taking the average keypress over a timespan is kind of silly, but maybe people get to join and play the role of a neuron somewhere in the network
20:17 < kanzure> "welcome to the neural network; you are now a neuron. you will see things on your screen. press buttons on your keyboard as you feel appropriate, or you can rest when you want. your score will be in the upper right hand corner."
20:17 < fenn> in real brains there are inhibitory neurons
20:17 < kanzure> isn't that just an inverse
20:18 < fenn> it stops a neuron from transmitting further down the chain
20:18 < kanzure> yeah i think that can be implemented. people login and you assign them to different types or roles (but they don't know what that is nor do i think they need to?).
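[editor's note: fenn's point about inhibitory neurons stabilizing a network can be shown with a toy rate model: a self-exciting population runs away unless an inhibitory population, driven by its own activity, feeds back negatively. All weights, names, and constants below are invented for illustration; this is not a model of any real network.]

```python
# Simple Euler-integrated rate model of one excitatory ("B-masher")
# population and one inhibitory population.
def step(exc, inh, w_self=1.2, w_ei=1.0, w_ie=1.5, dt=0.1):
    # Excitatory rate relaxes toward its rectified net input;
    # inhibitory rate relaxes toward the excitatory drive it receives.
    d_exc = -exc + max(0.0, w_self * exc - w_ie * inh + 0.1)
    d_inh = -inh + w_ei * exc
    return exc + dt * d_exc, inh + dt * d_inh

# With inhibition: activity settles at a low stable rate,
# exc* = 0.1 / (1 + w_ie - w_self) = 0.1 / 1.3.
exc, inh = 0.5, 0.0
for _ in range(500):
    exc, inh = step(exc, inh)
print(round(exc, 4))  # 0.0769

# Without inhibition (w_ie = 0): the same population runs away,
# since w_self > 1 makes the net drive grow with the rate.
runaway, quiet = 0.5, 0.0
for _ in range(100):
    runaway, quiet = step(runaway, quiet, w_ie=0.0)
print(runaway > 1.0)  # True
```

In the twitch-viewers-as-neurons scheme this is the difference between a contingent that only ever mashes B and one whose influence gets damped as soon as it dominates.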
20:18 < kanzure> and the system also comes up with the overall connectivity graph
20:18 < fenn> i imagine if you have twitch viewers self-sorting by coincident button pressing, you'd rapidly end up with a single contingent that just mashes B constantly, winning out over all the others and ruining the game
20:19 < fenn> inhibitory neurons are the solution to this
20:19 < kanzure> hm? that's not what happened with twitchplayspokemon. somehow they beat the game.
20:20 < kanzure> https://en.wikipedia.org/wiki/Twitch_Plays_Pok%C3%A9mon
20:21 < fenn> ok there are too many pokemon games
20:23 < kanzure> nice touch "Uses a modded version, with 251 species of Pokémon available and the final boss Red's team consisting of the same team from Twitch Plays Pokémon Red."
20:29 < fenn> in principle i dont see any reason why you couldn't shove lots of wires into a drosophila brain and send its inputs and outputs over a network to other drosophila brains, thereby creating a superbrain
20:29 < fenn> but like, why bother with the drosophila brains at that point
20:30 < hprmbridge> kanzure> the brain implant intervention increases cost too much.
20:30 < hprmbridge> kanzure> also, the idea of using biological brains is scale
20:31 < fenn> squishy brains can't have their state directly read out or written to, and they have complex chemical support systems, and they eventually die for no particular reason
20:31 < hprmbridge> kanzure> also they have good neuron density
20:32 < fenn> i bet 3d semiconductor neurons have better density
20:32 < fenn> especially once you factor in the speed increase
20:32 < fenn> if you ran a CPU at 300Hz it wouldn't generate that much heat
20:32 -!- yashgaroth [~ffffffff@2601:5c4:c780:6aa0:4d45:8f41:ebb3:6d1b] has quit [Quit: Leaving]
20:36 < hprmbridge> kanzure> you can make more biological neurons than semiconductor neurons
20:37 < hprmbridge> kanzure> how many layers is your 3d semiconductor?
20:39 < fenn> a million
20:40 < fenn> enough to make it a volume, not a plane
20:40 < hprmbridge> kanzure> has this been implemented?
20:40 < fenn> of course not
20:41 < fenn> that would require several million fab steps
20:42 < hprmbridge> kanzure> hence biological brains
20:42 < fenn> why is crossbar ram not a thing yet
20:44 < fenn> https://www.crossbar-inc.com/technology/reram-overview/
20:47 < hprmbridge> kanzure> if you have a specific question for them, I can ask mark davis or sung hyun jo, I have them on signal
20:48 < fenn> no, just complaining i guess
20:49 < hprmbridge> kanzure> they do have real chips that have shipped reram
20:49 < fenn> you can actually buy intel optane PCIe bus RAM
20:50 < fenn> i would have expected this to blow everything else out of the water, but that didn't happen
20:50 < hprmbridge> kanzure> one reason might be that they originated from mobile phone chip industry so that's what they know best
20:51 < hprmbridge> kanzure> they might not be aware of the other opportunities
20:52 < hprmbridge> kanzure> also bunnie has their ear lately
20:58 < fenn> crossbar's "high performance" memory has a 12 microsecond write time (vs like 1ns for a mediocre DRAM)
21:01 < fenn> 7ns for the persnickety
21:14 < fenn> a semiconductor neuron ASIC should probably end up looking a lot like crossbar or 3D-xpoint, but with a different connectivity topology
21:16 < fenn> "analog deep learning" or "neuromorphic computing" are the relevant buzzwords
21:18 < fenn> "Neuromemristive systems"
21:37 -!- hellleshin [~talinck@108-225-123-172.lightspeed.cntmoh.sbcglobal.net] has joined #hplusroadmap
21:40 -!- helleshin [~talinck@108-225-123-172.lightspeed.cntmoh.sbcglobal.net] has quit [Ping timeout: 248 seconds]
--- Log closed Thu Feb 23 00:00:52 2023