--- Log opened Wed Feb 07 00:00:51 2024
01:12 -!- Malvolio is now known as mabeL
01:40 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has joined #hplusroadmap
04:28 -!- justanotheruser [~justanoth@gateway/tor-sasl/justanotheruser] has quit [Ping timeout: 255 seconds]
04:59 < kanzure> https://www.gregegan.net/MISC/CRYSTAL/Crystal.html
05:30 < hprmbridge> soul_syrup> is anyone here publishing on arxiv on robotics research?
05:42 < hprmbridge> soul_syrup> also would anyone here like to publish on arxiv with me ? I published a paper several months ago on neural signal analysis https://arxiv.org/abs/2402.03316, and will be publishing on my current project in a month (bio silicon synergetic learning systems https://github.com/Unlimited-Research-Cooperative/Human-Brain-Rat
06:52 < kanzure> "Orthogonal LoxPsym sites allow multiplexed site-specific recombination in prokaryotic and eukaryotic hosts" https://www.nature.com/articles/s41467-024-44996-8
06:54 < kanzure> from https://twitter.com/KevinVerstrepen/status/1755189757611516042
06:59 < kanzure> "Global Open Genetic Engineering Competition (Gogec), a free and open synthetic biology competition aimed at democratizing access to synthetic biology for student researchers globally ... We all know iGEM's fees are out of control"
07:34 -!- justanotheruser [~justanoth@gateway/tor-sasl/justanotheruser] has joined #hplusroadmap
07:40 -!- mrdata [~mrdata@user/mrdata] has quit [Ping timeout: 268 seconds]
09:23 -!- Hooloovoo [~Hooloovoo@hax0rbana.org] has quit [Ping timeout: 256 seconds]
09:26 -!- Hooloovoo [~Hooloovoo@hax0rbana.org] has joined #hplusroadmap
10:47 -!- millefy [~Millefeui@91-160-78-132.subs.proxad.net] has joined #hplusroadmap
11:02 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has quit [Read error: Connection reset by peer]
11:22 -!- millefy [~Millefeui@91-160-78-132.subs.proxad.net] has quit [Ping timeout: 276 seconds]
12:02 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has joined #hplusroadmap
12:04 -!- cthlolo [~lorogue@77.33.24.3.dhcp.fibianet.dk] has joined #hplusroadmap
12:14 < kanzure> after 20 years it's still the same "humanity+" crew https://bgi24.ai/ ("beneficial AGI summit")
12:19 -!- cthlolo [~lorogue@77.33.24.3.dhcp.fibianet.dk] has quit [Quit: Leaving]
14:22 < fenn> i want whatever drugs soul_syrup is on
14:46 < fenn> what does iGEM do with fees?
14:46 < hprmbridge> kanzure> pay for the enormous igem conference facility
14:47 < hprmbridge> kanzure> probably a $10m/year event
14:54 < fenn> it's a big facility but there are also a lot of attendees
14:54 < hprmbridge> kanzure> the fees pay for sending out the biobricks
14:55 < fenn> oo that's gotta be expensive, dotting some lysate on filter paper and stuffing it in an envelope
14:55 < fenn> ~
14:56 < fenn> do these even go bad? can they just make a big batch of standard biobrick sheets and then re-use them for a decade?
14:57 < hprmbridge> kanzure> careful sisyphus
14:57 -!- TMM_ [hp@amanda.tmm.cx] has quit [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
14:57 -!- TMM_ [hp@amanda.tmm.cx] has joined #hplusroadmap
14:57 < fenn> i'm having intrusive thoughts about matrix multiplication blocking tiling algorithms (it's the same parallelism problem)
14:58 < hprmbridge> kanzure> they get new bricks each year. but also, I might have heard they stopped shipping them?
14:58 < fenn> why stop shipping?
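A minimal sketch of the blocked (tiled) matrix multiplication fenn alludes to at 14:57: the output is split into independent tiles, and each tile accumulates its own partial products, which is the same make-once-reuse-many parallelism pattern as the biobrick batching discussed above. Python/numpy; the tile size is illustrative and not tuned for any particular cache.

```python
# Blocked (tiled) matrix multiplication sketch. Tile size is illustrative.
import numpy as np

def blocked_matmul(A, B, tile=64):
    """Multiply A (m x k) by B (k x n) one tile at a time.

    Each (i, j) output tile accumulates partial products over the k
    dimension; output tiles are independent of each other, which is
    where the parallelism comes from.
    """
    m, k = A.shape
    k2, n = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((m, n), dtype=A.dtype)
    for i0 in range(0, m, tile):
        for j0 in range(0, n, tile):
            for k0 in range(0, k, tile):
                C[i0:i0+tile, j0:j0+tile] += (
                    A[i0:i0+tile, k0:k0+tile] @ B[k0:k0+tile, j0:j0+tile]
                )
    return C

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, B = rng.standard_normal((300, 200)), rng.standard_normal((200, 150))
    assert np.allclose(blocked_matmul(A, B), A @ B)
```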
15:01 < fenn> also i want to just blort this here for searchability: "in vivo, multiplexed Gene Expression Modification by LoxPsymCre Recombination (GEMbLeR). This approach facilitates creation of large strain libraries, in which expression of every pathway gene ranges over 120-fold"
15:02 < fenn> it's a tortured acronym
15:03 < fenn> something about shuffling would have been nice
15:04 < fenn> just like every lab needs an artist in residence, each university should provide access to a punnery
15:05 < fenn> they could stock it with refugees from the de-funded humanities departments
15:06 < fenn> maybe we can train an LLM to do this
15:10 < fenn> iGEM fees are $6000 per team. to enter the "grand jamboree" it's an additional $3000 + $550 per team member
15:14 < L29Ah> gotta have punnery full of pundits
15:18 < fenn> there doesn't seem to be a limit to iGEM team sizes but the guide to building a team says 8 - 15 people
15:22 < fenn> Gogec is february 23 - 25
16:07 < hprmbridge> kanzure> why ship when the teams can just order from twist
16:46 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has quit [Ping timeout: 246 seconds]
17:12 -!- justanotheruser [~justanoth@gateway/tor-sasl/justanotheruser] has quit [Ping timeout: 255 seconds]
17:12 < hprmbridge> Eli> Is having multiple LLMs answer your prompt at the same time considered a Mixture of Experts 🤔
17:12 < hprmbridge> Eli> https://labophase.com/
17:25 -!- justanotheruser [~justanoth@gateway/tor-sasl/justanotheruser] has joined #hplusroadmap
17:42 < fenn> because economies of scale
17:43 < fenn> synthesize once, run PCR to make infinite copies, spot from your PCR vat onto blotter paper infinity times
17:44 < fenn> ordering from twist is like custom CNC machined standardized nuts and bolts
17:49 < fenn> @Eli labophase actually seems like a good deal
17:50 < fenn> presumably you don't have to sell your soul to microsoft either
17:51 < fenn> they should be using mistral-medium instead of mixtral
17:55 -!- mrdata [~mrdata@user/mrdata] has joined #hplusroadmap
18:32 < hprmbridge> Eli> I'm just using the free tier. I'm wondering how much better these LLM's are really going to get? Like, is it just going to be more of a monotonic increase in performance? I spoke to one of the Jasper AI leads and he told me he thinks after gpt5/6 it's pretty much over.
18:41 -!- darius__ is now known as abecedarius
18:48 < hprmbridge> kanzure> "Electromagnetic modulation of monochromatic neutrino beams" https://arxiv.org/abs/1506.07883
18:49 < hprmbridge> kanzure> instead of undersea cables, why not transmit data via line of sight through the planet with neutrino beam modulation
18:55 < hprmbridge> alonzoc> Neutrino data link is the future of HFT I'm telling you!
18:55 < hprmbridge> alonzoc> The real issue is receivers would be insanely bulky for the same reason neutrinos are useful for this in the first place
18:56 < hprmbridge> alonzoc> If axions are real we could use really powerful magnetic fields to convert photons into axions and then back, that'd be another way to do point to point transmission through massive objects
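A back-of-envelope sketch of why a through-the-planet link appeals to HFT, per kanzure's 18:49 suggestion and alonzoc's reply: a neutrino beam could take the straight chord through the Earth, while fiber follows at best the great-circle arc and is slowed by the glass. The constants are rough (mean Earth radius, fiber group index ~1.47), and the hard parts (modulation rate, detector mass) are waved away.

```python
# Rough latency comparison: straight chord through the Earth vs. fiber along
# the surface, for two stations separated by a given central angle.
import math

R_EARTH_KM = 6371.0          # mean Earth radius
C_KM_S = 299_792.458         # speed of light in vacuum
FIBER_INDEX = 1.47           # approximate group index of optical fiber

def one_way_latency_ms(central_angle_deg):
    theta = math.radians(central_angle_deg)
    chord = 2 * R_EARTH_KM * math.sin(theta / 2)   # straight line through the planet
    arc = R_EARTH_KM * theta                       # great-circle surface distance
    neutrino_ms = chord / C_KM_S * 1e3             # neutrinos barely interact, ~c
    fiber_ms = arc * FIBER_INDEX / C_KM_S * 1e3    # light slowed in glass
    return neutrino_ms, fiber_ms

if __name__ == "__main__":
    for angle in (60, 120, 180):                   # 180 degrees = antipodal
        n_ms, f_ms = one_way_latency_ms(angle)
        print(f"{angle:3d} deg: neutrino {n_ms:5.1f} ms vs fiber {f_ms:5.1f} ms")
```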
20:34 -!- mrdata [~mrdata@user/mrdata] has quit [Ping timeout: 264 seconds]
20:49 -!- mxz [~mxz@user/mxz] has quit [Ping timeout: 260 seconds]
21:20 -!- Netsplit *.net <-> *.split quits: justanotheruser
21:20 -!- Netsplit over, joins: justanotheruser
21:20 -!- Netsplit *.net <-> *.split quits: geneh2, ike8, alethkit
21:21 -!- cc0 is now known as 076AAK90M
21:21 -!- Netsplit *.net <-> *.split quits: yorick, 076AAK90M, potatope, EmmyNoether, A_Dragon, redlegion
21:22 -!- Netsplit over, joins: 076AAK90M, potatope, A_Dragon, yorick, redlegion, EmmyNoether
21:22 -!- Netsplit over, joins: alethkit, geneh2, ike8
21:22 -!- alethkit [23bd17ddc6@sourcehut/user/alethkit] has quit [Max SendQ exceeded]
21:24 -!- Netsplit *.net <-> *.split quits: faceface, acertain_, FelixWeis__, cpopell_, Betawolf
21:24 -!- Netsplit *.net <-> *.split quits: TMA, streety, s0ph1a
21:24 -!- Netsplit over, joins: FelixWeis__, faceface, TMA, s0ph1a, streety, Betawolf, acertain_, cpopell_
21:24 -!- alethkit [23bd17ddc6@sourcehut/user/alethkit] has joined #hplusroadmap
21:28 -!- Netsplit *.net <-> *.split quits: kanzure, gptpaste, RubenSomsen, archels
21:28 -!- Netsplit *.net <-> *.split quits: strages, nmz787, pasky, yuanti, jrayhawk
21:28 -!- Netsplit *.net <-> *.split quits: AugustaAva, srk, Jenda, otoburb, Hooloovoo, SDr
21:29 -!- Netsplit *.net <-> *.split quits: hprmbridge, nsh, L29Ah, Croran
21:29 -!- Netsplit over, joins: jrayhawk, strages, nmz787, pasky, yuanti
21:31 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has joined #hplusroadmap
21:31 -!- Croran [~Croran@user/Croran] has joined #hplusroadmap
21:31 -!- hprmbridge [~hprmbridg@user/fenn/bot/fennbots] has joined #hplusroadmap
21:31 -!- nsh [~lol@user/nsh] has joined #hplusroadmap
21:31 -!- RubenSomsen [sid301948@user/rubensomsen] has joined #hplusroadmap
21:31 -!- kanzure [~kanzure@user/kanzure] has joined #hplusroadmap
21:31 -!- gptpaste [~x@yoke.ch0wn.org] has joined #hplusroadmap
21:31 -!- archels [~neuralnet@static.65.156.69.159.clients.your-server.de] has joined #hplusroadmap
21:31 -!- mabeL [~Malvolio@idlerpg/player/Malvolio] has quit [Ping timeout: 260 seconds]
21:33 -!- Hooloovoo [~Hooloovoo@hax0rbana.org] has joined #hplusroadmap
21:33 -!- SDr [~SDr@user/sdr] has joined #hplusroadmap
21:33 -!- otoburb [~otoburb@user/otoburb] has joined #hplusroadmap
21:33 -!- srk [~sorki@user/srk] has joined #hplusroadmap
21:33 -!- Jenda [~jenda@coralmyn.hrach.eu] has joined #hplusroadmap
21:33 -!- AugustaAva [~x@yoke.ch0wn.org] has joined #hplusroadmap
21:34 -!- Netsplit *.net <-> *.split quits: catalase, Croran, Jenda, nsh, TMM_, srk, yuanti, superz, RubenSomsen, acertain_, (+49 more, use /NETSPLIT to show all of them)
21:36 -!- Netsplit over, joins: RubenSomsen, kanzure, @ChanServ, FelixWeis__, andytoshi, otoburb, jrayhawk, berndj, gwillen, faceface (+49 more)
21:42 -!- Netsplit *.net <-> *.split quits: Croran, Jenda, nsh, TMM_, srk, yuanti, RubenSomsen, acertain_, superkuh, hellleshin, (+43 more, use /NETSPLIT to show all of them)
21:44 -!- Netsplit over, joins: RubenSomsen, kanzure, @ChanServ, FelixWeis__, andytoshi, otoburb, jrayhawk, berndj, gwillen, faceface (+43 more)
21:45 -!- Goober_patrol66 [~Gooberpat@2603-8080-4540-7cfb-0000-0000-0000-113a.res6.spectrum.com] has quit [Remote host closed the connection]
21:45 -!- Goober_patrol66 [~Gooberpat@2603-8080-4540-7cfb-0000-0000-0000-113a.res6.spectrum.com] has joined #hplusroadmap
21:56 -!- Netsplit *.net <-> *.split quits: superz, Chiester
22:01 -!- Chiester [~Chiester@user/Chiester] has joined #hplusroadmap
22:01 -!- superz [~superegg@user/superegg] has joined #hplusroadmap
22:55 -!- mxz [~mxz@user/mxz] has joined #hplusroadmap
23:18 < fenn> @Eli there is so much groundbreaking AI stuff in the pipeline, it's hard to keep up
23:19 < fenn> @Eli this is all stuff you can use *right now* https://old.reddit.com/r/LocalLLaMA/comments/19fgpvy/llm_enlightenment/
23:23 < fenn> this is just base model improvements, not even talking about fancy prompt techniques like tree of thoughts or stepwise self-evaluation, and then there's special purpose hardware that is being built for transformers (which are now already obsolete perhaps)
23:24 < fenn> going further down the hardware rabbit hole is analog crossbar switches and optical computers
23:25 < fenn> anyone just looking at numbers going up is missing the big picture, which is that an order of magnitude more people are working on AI now vs last decade
23:26 < fenn> that has network effects
23:26 < fenn> i mean GPT4 is *already* smarter than most humans
23:48 < fenn> earlier today i was reading about 13 millisecond image generation times in stable diffusion. this is using the same consumer hardware we had a year ago (4090)
23:54 < fenn> nobody has even touched fast feedforward networks yet (additional gating networks to optimally partition/sparsify a neural network based on learned token heuristics)
23:57 < fenn> a few months ago we started to get 3d shape generation from 2d diffusion models, and adding synthetic depth training data helps a lot with quality and consistency
23:57 < fenn> i haven't been specifically looking for engineering simulation AI but i don't see why this couldn't be done with what we have right now
--- Log closed Thu Feb 08 00:00:02 2024
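Following up on fenn's 23:54 remark, a toy, inference-only sketch of the fast-feedforward idea as described there: a small tree of learned gates routes each token to exactly one small leaf MLP, so per token only the gates on one root-to-leaf path plus that single leaf are evaluated, rather than the whole wide feedforward layer. Weights below are random placeholders, and the hard routing skips the soft mixture relaxation a real training run would need.

```python
# Toy fast-feedforward layer: gate tree picks one leaf MLP per input vector.
import numpy as np

rng = np.random.default_rng(0)
D_MODEL, D_LEAF, DEPTH = 64, 128, 3              # 2**DEPTH = 8 leaf MLPs
N_LEAVES = 2 ** DEPTH

gate_w = rng.standard_normal((2 ** DEPTH - 1, D_MODEL)) * 0.1  # one gate per internal node
leaf_w1 = rng.standard_normal((N_LEAVES, D_MODEL, D_LEAF)) * 0.1
leaf_w2 = rng.standard_normal((N_LEAVES, D_LEAF, D_MODEL)) * 0.1

def fff_forward(x):
    """Route one token vector x (shape [D_MODEL]) down the gate tree, then run its leaf MLP."""
    node = 0
    for _ in range(DEPTH):
        go_right = gate_w[node] @ x > 0.0         # hard routing decision at this gate
        node = 2 * node + (2 if go_right else 1)  # heap-style child index
    leaf = node - (2 ** DEPTH - 1)                # convert tree node index to leaf index
    hidden = np.maximum(x @ leaf_w1[leaf], 0.0)   # small leaf MLP with ReLU
    return hidden @ leaf_w2[leaf]

y = fff_forward(rng.standard_normal(D_MODEL))
print(y.shape)  # (64,)
```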