--- Log opened Mon Jul 10 00:00:55 2023
00:24 -!- TMM_ [hp@amanda.tmm.cx] has quit [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
00:25 -!- TMM_ [hp@amanda.tmm.cx] has joined #hplusroadmap
00:39 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has joined #hplusroadmap
00:43 -!- stipa_ [~stipa@user/stipa] has joined #hplusroadmap
00:44 -!- stipa [~stipa@user/stipa] has quit [Ping timeout: 250 seconds]
00:44 -!- stipa_ is now known as stipa
00:49 -!- stipa_ [~stipa@user/stipa] has joined #hplusroadmap
00:51 -!- stipa [~stipa@user/stipa] has quit [Ping timeout: 246 seconds]
00:52 -!- stipa [~stipa@user/stipa] has joined #hplusroadmap
00:55 -!- stipa_ [~stipa@user/stipa] has quit [Ping timeout: 245 seconds]
00:59 -!- stipa_ [~stipa@user/stipa] has joined #hplusroadmap
01:01 -!- stipa [~stipa@user/stipa] has quit [Ping timeout: 252 seconds]
01:01 -!- stipa_ is now known as stipa
01:08 -!- catalase [catalase@freebnc.bnc4you.xyz] has quit [Quit: Stable ZNC by #bnc4you]
01:18 -!- TMA [tma@twin.jikos.cz] has quit [Ping timeout: 250 seconds]
01:18 -!- TMA [tma@twin.jikos.cz] has joined #hplusroadmap
01:27 -!- catalase [catalase@freebnc.bnc4you.xyz] has joined #hplusroadmap
02:18 < fenn> the real question is whether catgirls will have one or two sets of ears
03:18 < muurkha> stipa: fix your connection
03:37 < nsh> connection's fine. stuck in revolving doors
03:55 < fenn> https://old.reddit.com/r/aiArt/comments/14ln012/for_5_years_a_small_town_in_arkansas_has_hosted/
04:04 < nsh> 'There's actually a huge controversy in the scene right now after a genetic engineer managed to re-activate dinosaur genetics in a newborn lizard. Authorities are keeping a close watch and may have to put it down as it's growing in size with no sign of stopping anytime soon, and is currently reaching 2 meters in height. The whole community is really split about it. This isn't even his craziest project though - he is famously known for splicing parrot DNA for
04:04 < nsh> wings, and recently has been making huge progress on a genetically engineered lactobacillus that can output diesel, fully built-in as a symbiotic colony into a custom engineered mouth pocket.'
04:04 < nsh> surprise kanzure isn't in touch
04:04 < nsh> *surprised
04:06 < fenn> they call it dino juice for a reason
04:20 -!- _flood [flooded@gateway/vpn/protonvpn/flood/x-43489060] has joined #hplusroadmap
04:44 < hprmbridge> kanzure> hm?
04:49 < fenn> just some fae news
05:46 -!- yashgaroth [~ffffffff@2601:5c4:c780:6aa0:ac18:c9c1:cb60:a224] has joined #hplusroadmap
06:04 -!- stipa_ [~stipa@user/stipa] has joined #hplusroadmap
06:05 -!- stipa [~stipa@user/stipa] has quit [Ping timeout: 246 seconds]
06:05 -!- stipa_ is now known as stipa
08:55 -!- flooded [flooded@gateway/vpn/protonvpn/flood/x-43489060] has joined #hplusroadmap
08:56 < stipa> muurkha: pay and i'll fix it
08:57 -!- test_ [flooded@gateway/vpn/protonvpn/flood/x-43489060] has joined #hplusroadmap
08:59 -!- _flood [flooded@gateway/vpn/protonvpn/flood/x-43489060] has quit [Ping timeout: 264 seconds]
09:00 -!- flooded [flooded@gateway/vpn/protonvpn/flood/x-43489060] has quit [Ping timeout: 240 seconds]
09:03 < docl> stipa: how much
09:04 < stipa> Eur 50 per month should be enough
09:04 < stipa> for fibre
09:05 < stipa> now is 4G
09:05 < stipa> it's probably cells overheating
09:05 < stipa> it's 38°C overhere
09:06 < docl> hmm. I pay $4/mo for a ramnode instance for this
09:07 < stipa> that was pop over 15 years ago
09:08 < stipa> when ppl had 56k modems
09:08 < docl> ubuntu + tmux + irssi
09:08 < stipa> yeah, same here, manjaro + tmux + weechat via 4G
09:08 < stipa> no
09:08 < stipa> debian actually
09:09 < stipa> i have feeling that ppl with psybnc have trouble with nick highlighting and that kind of stuff
09:12 < stipa> i've got irssi on a BSD machine connected to OFTC
09:12 < stipa> also via tmux
09:12 < stipa> in a VM
09:13 < stipa> weechat has all the bells and whistles it takes ages to set up on irssi
09:15 < docl> I won a 56k modem in a sweepstakes from a shareware company called happy puppy once. it was a huge upgrade from the 14.4k modem my dad bought. so that's about how old I am. DSL existed but not where I lived
09:16 < docl> maybe I should try weechat and see if I like it better than irssi
09:18 < stipa> it's ok, set and forget type of thing
09:18 < stipa> works ok in tmux, it's mostly power outage on a server then tmux or weechat crash in my case
09:22 < docl> anyway irc is something that makes sense to do from ze clowd IMO, you don't have to bother people with disconnects constantly (the tmux session is running on the remote server, which is ~100% uptime because it's on a rack somewhere)
09:22 < stipa> DSL was super fancy thing when it came in my environment
09:22 < stipa> night and day compared to the 56k
09:22 < stipa> stable speed
09:23 < docl> for sure
09:23 < stipa> yeah, i like the feeling of having my own server
09:23 < stipa> it was my dream when i was a kid
09:24 < stipa> always connected machine
09:24 < stipa> i'm not responsible of bad ISP service
09:25 < stipa> that's not in my power to fix
09:25 < stipa> shitty ISP modems and such things...
09:26 < stipa> i do my best to do my part right, for everything else ppl will have to cope with
09:28 < docl> well ramnode is $4/mo. there are probably better options out there for ultra low scale. but it isn't the dream, I hear ya
09:29 < stipa> it's easy to pay for things but that also makes you dumb
09:29 < stipa> and without money
09:30 < stipa> but, i don't know what are your interests, i guess chemistry?
09:31 < stipa> i'm into electronics so i like it's technological side, paying someone else for IRC bouncer is just not acceptable
09:32 < stipa> if you're just into communicating with people via IRC then ramnode has sense
09:32 < docl> I'm a wide ranging interests guy, chemistry is recently acquired but computer science and electronics/hardware are interesting too. fun thing about irc is that everyone else is pretty into CS stuff too
09:34 < stipa> CS?
09:34 < docl> computer science
09:35 < stipa> computer scientists are on twitter and discogs
09:35 < stipa> IRC is just a bunch of Jacks
09:35 < docl> yeah but they're mostly jacks with strong CS. discord is gamers and twitter is literally everyone
09:36 < stipa> https://en.wikipedia.org/wiki/Jack_of_all_trades,_master_of_none
09:36 < docl> we have masters here too
09:37 < stipa> linkedin is full of CS dudes, those ppl have no idea what they're talking about
09:37 < docl> those are people optimizing for career
09:37 < stipa> there was a dude, like nice picture, authoritative and such stuff, the guy you don't want to mess with...
09:37 < stipa> he was talking about cloud tech
09:38 < stipa> man, he was speaking of i don't know what he even spoke about
09:38 < stipa> i said to him, cloud is just a bunch of computers running FTP servers renamed to cloud
09:39 < stipa> they think it's some kind of some voodoo hardware and stuff
09:39 < stipa> and they can explain it clearly
09:40 < stipa> at least they can but i have no clue what that is
09:40 < stipa> some even argued with me
09:40 < stipa> fuck, you know, they aren't even aware it's just a computer running software that act as servers
09:41 < stipa> that is a CS nowadays
09:42 < stipa> probably some dumb middlemen that have no clue what they're selling
09:42 < stipa> maybe they did some random CISCO course and think they're CS
09:43 < docl> that's business people trying to do CS... real CS is knowing how to make hardware do stuff. software lets you bridge from abstraction to reality. it's pretty neat.
09:44 < docl> there's a whole academic world of people going hyper abstract and then designing software to make a machine do that abstract thing as a real computation
09:46 < stipa> yeah
09:46 < stipa> all that power and nothing useful
09:47 < stipa> the abstraction trend started as a mean to bring chores of low level to the dumb masses
09:48 < stipa> what we have today is "abstraction" that practically doesn't work and waste of energy while it runs
09:49 < stipa> and you have CS experts that don't know how to turn on a computer selling that abstraction to ppl that have to rely with their lives on
09:50 < stipa> ppl have to eat
09:50 < stipa> some sell useless crap to survive and others suffer because of that
09:50 < docl> I don't agree that it's nothing useful, but I do agree that there's a lot of waste involved. people are looking for ways to spend joules instead of human-hours. I can't blame them, exactly, but makes me uncomfortable at times
09:52 < docl> a lot of people are probably going to get really into prompting AI without learning how it translates to reality very well
09:53 < stipa> i haven't seen anything useful of AI trend, i see just a bunch of machines capable of doing insane parallel operations at once, but there'll for sure be a talent or two who'll make something useful out of all that power
09:54 < stipa> i hope so
09:55 < docl> I think learning chemistry excited me so much because it gave some insights to pierce the veil between "plug these components together" vs "make atoms and electrons do things". people who mfg chips are turning chemistry into computing hardware, and there's something fascinating about that
09:56 < stipa> chip designers don't care about the chemistry
09:56 < stipa> they build structures
09:57 < docl> structures in the abstract? no, they have to figure out how to make actual atoms, silicon and doping elements, fit the structure. there's a distinction between a geometric shape in the abstract "a circle" vs an object with that property in physical reality
09:57 < stipa> https://www.youtube.com/watch?v=69mdJv6fWXE
09:58 < stipa> ^^ Explained How Chips Are Designed
09:58 < stipa> there's no chemistry, fabs give rules that has to be followed and that's it, depends on the manufacturing process
09:58 < nsh> pretty sure it's not independent of chemistry
09:58 < docl> http://elementsofprogramming.com/eop.pdf chapter 1 discusses abstract vs concrete things, matched my prior intuitions in an interesting way. I haven't quite fully wrapped my head around the concept of "regular" they are trying to teach though.
10:00 < stipa> chemistry part would i think be in the fab a thing
10:00 < stipa> maybe not even there
10:00 < stipa> probably fab buys the equipment and that's it
10:00 < stipa> having no clue what's going on in the machines
10:01 < stipa> secret upon a secret upon a secret....
10:03 < stipa> especially with the newest nodes
10:06 < stipa> docl: thanks for the book
10:06 < docl> I guess regular = you can substitute it losslessly. so in a chip context they are looking at an electrochemical process (charge affecting conductivity -> precisely substitutable with the logic operation of a gate closing) and once you know that process will work you can rely on it and focus on just the logic. but of course stuff like "how many times do I need to perform this logical operation" matters
10:06 < docl> because you're physically closing a circuit each time, which consumes time and energy.
10:07 < stipa> and it has to have fast turn on and off times
10:08 < stipa> the faster the turn on the faster the chip is the more energy it needs to turn on fast
10:08 < stipa> so, you have a trend of a CPU sucking a kW or so
10:08 < stipa> it's nothing strange today
10:09 < stipa> you need a phase from the electricity grid just for a PC nowadays
10:11 < docl> then because you're implementing it as circuits that must spend energy and time to open and close, even if the cost is *very* small, certain things that work perfectly fine in pure math land like "if I do this infinity times" are unimplementable. so as computer scientists we're focusing on the subset of things you can do when you have that kind of limit on what you can do. so like you might be able to
10:11 < docl> search by brute force, but choosing an algorithm that does it faster at cost of storing more memory will just make more sense / get it done faster / be able to work for bigger problems.
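[Editor's aside, not part of the log: docl's point about trading memory for operations can be sketched with a toy example. The function names and data here are invented for illustration; the only claim is the standard one that a precomputed hash-based index turns O(n)-per-query membership scans into O(1)-average lookups at the cost of O(n) extra memory.]

```python
def brute_force_contains(items, target):
    """O(n) per query: scan everything, no extra memory spent."""
    for x in items:
        if x == target:
            return True
    return False

def make_index(items):
    """Spend O(n) memory once so every later query is O(1) on average."""
    return set(items)

items = list(range(1_000_000))
index = make_index(items)

# same answers, very different cost profiles
assert brute_force_contains(items, 999_999)
assert 999_999 in index
assert not brute_force_contains(items, -1)
assert -1 not in index
```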
10:15 < stipa> the idea was if we go smaller the energy consumption will as well but that didn't happen, by going smaller the frequency has risen and for fast switch on you need a lot of energy
10:23 < stipa> gates on transistors are practically capacitors
10:25 < stipa> if you want to charge a cap faster you need more current
10:29 < nsh> or you do it on a vehicle making a round trip involving velocities relativistic to your frame of reference
10:31 < docl> yeah, that's an intricacy worth understanding. if you can parallelize an algorithm / split it into many separate processes, you can go lower energy and do many slower things at once. so it's not just about trading "number of ops needed" vs "memory" in that context. and different kinds of tasks can be easier or harder to parallelize
10:32 < docl> btw some people here are thinking about how to make rod logic computers
10:33 < nsh> In Rod We Trust
10:34 < nsh> hmmm, up muurkha's street: https://anthony-zhang.me/blog/rod-logic/
10:34 < nsh> i walked past the place where the Zuse Z's were built when i was in berlin recently
10:34 < nsh> or lived or something idk the plaque was in some gobblegook language
10:35 < docl> well, it's one of the interesting uses for nanotech (not that it has a shortage of interesting uses)
10:35 < nsh> if you're gonna make things smol then i'm not sure mechanical actuation makes more sense than electron states
10:35 < nsh> it's kinda nice at our scale
10:36 < nsh> and make down to cells
10:36 < nsh> (then again, i know nothing)
10:44 < docl> erik drexler had some math on this. logic gates that switch in 0.1 ns and dissipate 10^-21 J. 10^16 instructions per second per watt
10:44 < nsh> classical data is educated stupid. operations are cubic
11:01 < stipa> light could be the near future, i think there are already computer mobos with fibre interconnecting systems
11:02 < stipa> light based processors also work efficiently
11:02 < stipa> so like holographic memory and that things
11:03 < stipa> it's like powering a laser or an LED diode and that should be enough or smthn absurd like that, practically a few mA
11:05 < stipa> https://en.wikipedia.org/wiki/Optical_computing
11:06 < stipa> i think i saw optics flying around in this channel
11:07 < stipa> See How Lightelligence Created the World’s First Working Optical Computing System https://www.youtube.com/watch?v=a2merSqeVeI
11:12 < stipa> optoelectronics
11:16 < stipa> 100 x faster then the GPU
11:21 < superkuh> An exobyte of bandwidth?
11:21 < superkuh> /s
11:23 < hprmbridge> gourneau> https://www.nature.com/articles/s41467-023-38876-w
11:23 < kanzure> .title
11:23 < EmmyNoether> A biological camera that captures and stores images directly into DNA | Nature Communications
11:27 < docl> I wonder if that's anywhere near the theoretical limit for optics? current hash power to mine a bitcoin is around 10^18 ops. at the landaur limit, which is what drexler was apparently referencing, this would cost about a millijoule
11:44 < docl> https://en.wikipedia.org/wiki/Landauer%27s_principle
11:45 < muurkha> haha
12:04 -!- flooded [flooded@gateway/vpn/protonvpn/flood/x-43489060] has joined #hplusroadmap
12:07 -!- test_ [flooded@gateway/vpn/protonvpn/flood/x-43489060] has quit [Ping timeout: 245 seconds]
12:10 < stipa> docl: i would guess 100x faster is just an super shitty early prototype
12:11 < stipa> light is the fastest thing in the universe unless quantum maybe
12:11 < docl> yeah that would be my guess
12:12 < stipa> quantum would be like, let's skip all the computation and just get the value from the future
12:12 < docl> I'm not super confident about the landaur limit (I just know it's something a lot of smart people are willing to appear confident about, so maybe I should be)
12:13 < docl> reversible computing is supposed to be a workaround for the landaur limit though, somehow you don't have to erase it if you undo it
12:15 < stipa> that seem like it could be applied to electronics
12:17 < stipa> but low heat is also slow switching
12:17 < stipa> now does it apply to photonics too i wouldn't know
12:18 < stipa> maybe you also need a bunch of light in the photonic processor
12:18 < docl> well drexler says 0.1ns so maybe 10 GHz rod logic does approach the landaur limit. I wonder if you can overclock that in turn for faster rates
12:18 < stipa> to do complex operations
12:20 < stipa> yeah, 10GHz is super fast
12:21 -!- gptpaste [~x@yoke.ch0wn.org] has joined #hplusroadmap
12:21 < stipa> i9-13900KS Able to deliver up to 6.0 gigahertz (GHz)
12:22 < stipa> Intel is making the Core i9-13900KS on a 10 nm production node
12:22 < nsh> .gpt4 speculate on three or four ways we might surpass the information theoretic thermodynamic limit of Landauer's principle using novel forms of computation
12:23 < nsh> "As an AI language model, i cannae break the laws of physics, cap'n"
12:23 < stipa> so going smaller a bit 10Ghz will probably be reached
12:23 < gptpaste> 1. Quantum Computing: Quantum computing is a promising field that could potentially surpass the limits of Landauer's principle. Unlike classical computers, which use bits as their smallest unit of data (either a 0 or a 1), quantum computers use quantum bits, or qubits, which can represent both 0 and 1 simultaneously thanks to the principle of superposition. This allows quantum - http://sprunge.us/9wPMu2
12:23 < nsh> sweet beautiful nonsense
12:24 < nsh> 3. Optical Computing: Another potential method for surpassing Landauer's limit might be optical computing or photonic computing where photons are used instead of electrons for processing information. Photons move faster than electrons and do not produce heat due to resistance as they don't have charge unlike electrons.'
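[Editor's aside, not part of the log: docl's "about a millijoule" figure checks out. Landauer's principle bounds the energy to erase one bit at k_B * T * ln 2; the sketch below assumes room temperature and, as docl implicitly does, equates one "op" with one bit erasure.]

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact, SI 2019)
T = 300.0            # room temperature, K (assumption)

e_bit = k_B * T * math.log(2)   # minimum energy per erased bit, ~2.87e-21 J
total = e_bit * 1e18            # 10^18 bit erasures, docl's "one bitcoin worth"

assert 2.8e-21 < e_bit < 2.9e-21
assert 2.8e-3 < total < 2.9e-3  # a few millijoules: same order as the log's claim
```

Note this also matches Drexler's quoted 10^-21 J per gate event to within a factor of a few.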
12:24 < nsh> ah yes the notorious properties of photons that are emitted when wavefunctions change without any energetic coupling
12:24 < stipa> yeah, organic electronics is a bit much but promising
12:25 < stipa> that's probably cyborg stuff already
12:26 < stipa> photonics is the nearest
12:27 < stipa> if you look at humans as machines organic electronics suck a bunch of power
12:28 < stipa> via feeding
12:29 < stipa> nature probably did it as efficient as possible
12:29 < nsh> you can do quite a lot of thinking on a sandwich or two
12:29 < nsh> '3. Optical Computing: Another potential method for surpassing Landauer's limit might be optical computing or photonic computing where photons are used instead of electrons for processing information. Photons move faster than electrons and do not produce heat due to resistance as they don't have charge unlike electrons.'
12:29 < nsh> no
12:29 < nsh> 'Joseph Carlsmith estimates that the brain delivers roughly 11 petaFLOP/s (= 10^15 floating-point operations per second). If you eat a normal diet, you're expending roughly 10^-13 J/FLOP. 6 Sept 2022'
12:30 < nsh> oh dear i just cited lesswrong.org
12:30 < nsh> that's a paddlin'
12:30 < docl> you will be assimilated
12:31 < nsh> does floating-point even mean anything in the context of neurons
12:31 < nsh> as close to nothing as to be not worth making the distinction
12:31 < docl> hmm. it's not hard to store data in a light beam. just transmit it to a reflector at the distance in light-seconds away that corresponds to the time you want to read it out again
12:31 < nsh> oh no i drank some absinth and now my arithmetic coprocessor isn't working right
12:32 < nsh> 'Biological brains involve different principles than digital computers and these principles are not yet understood so that it is difficult to compare directly the computational capability of a human brain with a supercomputer. But some order-of-magnitude estimates (say based on the number of action potentials transmitted per second in an entire human brain between the approximately 10^11 neurons that have about 10^14 interconnections between each other) suggest
12:32 < nsh> that human brains also carry out the equivalent of 100-1000 petaflops of information processing.'
12:33 < nsh> all of this nonsense was written back when people had very little appreciation of all the 'processing' that happens within neurons (rather than merely between)
12:34 < nsh> last quote from http://webhome.phy.duke.edu/~hsg/414/images/brain-vs-computer.html
12:34 < nsh> --
12:34 < nsh> But consider the huge differences between the power and volume requirements of these two computing systems. A 100 petaflop supercomputer requires about 15,000,000 watts (enough power to support a city of about 10,000 homes), occupies an area of about an American football field of interconnected cabinets of CPUs, and requires a sophisticated and expensive cooling system to withdraw the large amount of heat produced. In contrast, your brain, even when solving a
12:34 < nsh> difficult physics problem, consumes about 15 watts (the power to keep lit a rather dim light bulb) and has a volume of about two fists. (And of course, brains are more impressive than supercomputers in other ways in that they self-assemble from a single cell, and they are self-learning entities that can master physics, math, language, art, music, and sports without being explicitly programmed.)
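[Editor's aside, not part of the log: the energy-per-FLOP figures quoted above can be recomputed directly. All numbers are taken from the quoted passages (15 MW for a 100 petaFLOP/s supercomputer, 15 W for the brain at the low end of the 100-1000 petaflop estimate); note Carlsmith's 10^-13 J/FLOP charges the whole diet against 11 petaFLOP/s, so it lands between the two values below.]

```python
supercomputer_W = 15e6         # ~15 MW, quoted
supercomputer_flops = 100e15   # 100 petaFLOP/s, quoted
brain_W = 15.0                 # ~15 W, quoted
brain_flops = 100e15           # low end of the quoted 100-1000 petaflop range

j_per_flop_super = supercomputer_W / supercomputer_flops   # ~1.5e-10 J/FLOP
j_per_flop_brain = brain_W / brain_flops                   # ~1.5e-16 J/FLOP

# on these rough numbers the brain is ~a million times more energy-efficient
assert abs(j_per_flop_super / j_per_flop_brain - 1e6) < 1.0
```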
12:34 < nsh> -- (explicitly programming by a few billion years of evolution in a highly complex environment)
12:34 < nsh> *programmed
12:35 < nsh> the last paragraph is fun though
12:42 < docl> lol yeah, now I'm kind of picturing a superintelligence that works mainly by repeatedly producing a giant 3d grid of short lived billion-kelvin black holes and evaporating them to extract the data
12:49 < muurkha> with respect to https://anthony-zhang.me/blog/rod-logic/ I think Zhang has a few things to learn about fanout
12:50 < docl> but like 15 W should translate to... drexler says 10^9 ops per 10^-21 W -> 10^30 ops per watt at the landaur limit. so uh peta means 10^15 so a human at 1000 peta(fl?)ops is uh roughly one "bitcoin worth" per second? anyway 10^30 is a 10^12 bigger than that so... 1 trillion humans? we're 1/trillionth as efficient as we could be. if you trust 1000 peta-ops as a ballpark anyway
12:51 < nsh> one bitcoin worth is a pretty moving target
12:51 < nsh> usually up
12:51 < docl> oh yes. just saying.
12:54 < docl> actually that was calculated per watt so 15x as much... well, we spend way more watts on a meatbody than that, so it's really a lot more. anyway, meat brains look impressive vs silicon but not theoretical landaur limit pushing mechanisms
12:56 < nsh> we only end up in metabags because we fail to recognise our true nature by dying unmindfully
12:56 < nsh> ( https://www.britannica.com/topic/Bardo-Thodol )
12:57 < nsh> .t https://www.youtube.com/watch?v=1TV11z9CxF4
12:57 < EmmyNoether> The Tibetan Book of the Dead - Bardo Reading in English - YouTube
12:59 < docl> so the window of opportunity to make bio computers a thing is finite. maybe 1k flops per 15 watt could be replicated and made digitally programmable, and if so yes you could beat the current (1 exa-op) btc network that way in energy costs, but it's not like this is pushing theoretical limits on energy->compute
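[Editor's aside, not part of the log: docl's "10^30 ops per watt" does not follow from the 10^-21 J figure; 10^-21 J per op gives 10^21 ops per watt-second. Redoing the division with his own inputs (10^18 "ops"/s for the brain, 15 W) gives a gap of about four orders of magnitude, not twelve, though "op" is doing a lot of work on both sides of the comparison.]

```python
e_op = 1e-21                   # J per operation (Drexler's rod-logic figure, quoted)
ops_per_watt = 1 / e_op        # 1e21 ops/s per watt, NOT 1e30

brain_ops = 1e18               # "1000 peta-ops" ballpark from the log
brain_W = 15.0
brain_ops_per_watt = brain_ops / brain_W   # ~6.7e16 ops/s per watt

gap = ops_per_watt / brain_ops_per_watt    # ~1.5e4
assert 1e4 < gap < 2e4         # ~10,000x, not a trillion, on these inputs
```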
13:37 < juri_> now if you'll excuse me, i'll go back to using todays tech to produce tomorrows algorithms, for yesterday's problems.
13:37 * stipa hands a red pill to juri_
13:45 < nsh> you'd think morpheus would have like a pez dispenser or something
13:46 < nsh> i mean has he just got those things loose in his trenchcoat or what?
13:55 < docl> I've been reading about hermann grassmann (19th century projective geometric algebra guy), interesting stuff
14:21 < alethkit> Was it projective GA that he worked on, or just GA in general?
14:28 < juri_> I think he was just GA, but honestly, i don't pay much attention to the history parts of the lessons.
14:30 < nsh> i don't think Grassmann was must aware of their history when he was working on them either, tbf
14:42 < docl> https://core.ac.uk/download/pdf/231908059.pdf is what I was reading. his approach seems projective to me the way they describe it, but I could be wrong
14:55 < nsh> it's not but people have made projective grassmann (or in current vogue 'geometric') algebras. e.g. https://en.wikipedia.org/wiki/Grassmann%E2%80%93Cayley_algebra / https://projectivegeometricalgebra.org/
14:56 -!- TMM_ [hp@amanda.tmm.cx] has quit [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
14:56 -!- TMM_ [hp@amanda.tmm.cx] has joined #hplusroadmap
14:57 < nsh> the axioms of projective geometry by Whitehead was published in 1923 by which time Grassmann have been dead for almost fifty years
14:57 < nsh> actually 1906
14:57 < nsh> so thirty
14:58 < nsh> but it was building on work from the 19th century i guess
15:00 < nsh> 'However, there were some conceptual connections between Grassmann's ideas and projective geometry. Grassmann's theory of extension included the notion of a "meet" and "join" operation, which had similarities to the projective concepts of intersection and union. Grassmann's work also had implications for the understanding of duality in projective spaces.'
15:00 < nsh> (suggests entropymonster)
15:02 < juri_> I have my own projective geometric operators. with blackjack.
15:03 < docl> https://en.wikipedia.org/wiki/Whitehead%27s_point-free_geometry
15:05 < docl> talk about going hyper-abstract
15:05 < docl> > Point-free geometry was first formulated in Whitehead (1919, 1920), not as a theory of geometry or of spacetime, but of "events" and of an "extension relation" between events.
15:53 < muurkha> nsh: maybe you can help juri_ with her Clifford algebra code; she's written a CAD system in it
16:00 < nsh> dunno, don't like thinking or trying hard
16:01 < nsh> we have other people for that these days
16:02 < nsh> (i jest, poorly) is this system described anywhere?
16:02 < Hooloovoo> hmm, something didn't like that wp title
16:03 < Hooloovoo> https://en.wikipedia.org/wiki/Whitehead%27s_point-free_geometry does that work?
16:05 < nsh> .gpt4 could whitehead's point-free geometry be related to interval arithmetic?
16:06 < gptpaste> Whitehead's point-free geometry, also known as mereotopology, is a theory of space that is based on the concepts of part-whole and boundary rather than points. It focuses on regions and their relationships. Interval arithmetic, on the other hand, is a mathematical method that uses intervals or ranges of values instead of fixed numbers. This allows for more accurate representat - http://sprunge.us/IFyWat
16:06 < nsh> oh yes, one's qualitative and the other's quantitative. you're truly a genius. now where's that lamp
16:11 < nsh> "This paper advocates some very heretical views about the foundations of real and interval analysis. You can either burn me at the stake for blasphemy, or open your mind to the very simple and natural arguments that I give here." - http://www.paultaylor.eu/ASD/intawi.pdf
16:12 < nsh> god save all cranks and misfits
16:13 < nsh> another personal favourite
16:13 < nsh> .wik Pointless topology
16:13 < EmmyNoether> "In mathematics, pointless topology, also called point-free topology (or pointfree topology) and locale theory, is an approach to topology that avoids mentioning points, and in which the lattices of open sets are the primitive notions." - https://en.wikipedia.org/wiki/Pointless_topology
16:14 < nsh> oscar wilde would have appreciated it, if he wasn't so busy buggering pretty boys and being insulting curtains
16:14 < nsh> ( https://janwriter.medium.com/oscar-wilde-on-art-75c05aeeb9b )
16:15 < nsh> -being
16:18 < nsh> 'A celebrated reviewer once described a certain paper (in a phrase which never actually saw publication in Mathematical Reviews) as being concerned with the study of "valueless measures on pointless spaces"' - https://projecteuclid.org/journals/bulletin-of-the-american-mathematical-society-new-series/volume-8/issue-1/The-point-of-pointless-topology/bams/1183550014.full
16:29 < docl> pointless, but is it also useless?
16:39 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has quit [Ping timeout: 245 seconds]
16:47 < docl> why is the "join" symbol ginormous and have stuff underneath it?
16:55 < nsh> where?
16:55 < docl> the font enlargening convention has been bugging me for a long time, usually I see it when summation is represented with capital sigma. surely there's a reason for it
16:55 < docl> on the pointless topology wiki page
16:59 < nsh> indeed it's the same convention
17:00 < nsh> ranging over i's belonging to I
17:00 < docl> more common examples described here (but nothing about why -- presumably it makes sense in a historical context) https://mathmaine.com/2010/04/01/sigma-and-pi-notation/
17:01 < nsh> a product a sum and a union are all ways of considering some elements collectively
17:01 < nsh> likewise a join
17:02 < nsh> but the join is the one that looks like a V, the upside down one is the meet
17:02 < nsh> (akin to intersection)
17:02 < docl> ok that makes sense to me now
17:03 < nsh> you could translate the whole thing to a logical expression
17:04 < nsh> (being the expression of a distributive law)
17:07 < docl> https://en.wikipedia.org/wiki/Iterated_binary_operation
17:18 -!- TMM_ [hp@amanda.tmm.cx] has quit [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
17:19 -!- TMM_ [hp@amanda.tmm.cx] has joined #hplusroadmap
18:01 < muurkha> I always forget which one is meet and which one is join
18:01 < muurkha> I think it's useful to think of meet and join as generalizations of minimum and maximum, or intersection and union
18:22 -!- test_ [flooded@gateway/vpn/protonvpn/flood/x-43489060] has joined #hplusroadmap
18:26 -!- flooded [flooded@gateway/vpn/protonvpn/flood/x-43489060] has quit [Ping timeout: 246 seconds]
18:56 < docl> nice mneumonic. I assume the downward V is minimum?
19:10 < muurkha> it's not a mnemonic
19:11 < muurkha> ∨ is maximum, as it happens, and ∧ is minimum
19:12 < muurkha> I mean, in Boolean algebra, with the usual assignments of numbers to truth-values rather than the Unix shell or Urbit assignment
19:12 < docl> oh, so it's the opposite... I guess you can just remember it as being backwards :/
19:13 < docl> uh so what precisely do maxima and minima have to do with meet and join? I'm a little hazy on this
19:16 < muurkha> I always have to look it up, but apparently join is supremum ∨ and meet is infimum ∧
19:17 < docl> well wiki says V = join and turned-V is meet
19:18 < docl> https://en.wikipedia.org/wiki/Pointless_topology this page is where I'm getting that from
19:19 < docl> I have a new akrasia hack I'm trying out: get super angry at the author of the confusing material and turn it into an obsession
19:27 -!- yashgaroth [~ffffffff@2601:5c4:c780:6aa0:ac18:c9c1:cb60:a224] has quit [Quit: Leaving]
19:42 < docl> I don't think it's works as the normal kind of anger which implies disrespect. it would be hard to genuinely disrespect euler (my current target). but it's easy enough to form a kind of faux outrage where I'm basically simulating a character that hates his guts. I wonder if people do this a lot and it's why nerds come across as bitter a lot of the time
19:51 < muurkha> docl: that's what I said, yes
20:18 < muurkha> I don't know how much this faux outrage happens but I certainly feel frustrated when I'm struggling to understand something, and that is kind of like being angry
20:51 < docl> yes, same here. I usually call it frustration rather than anger in my mental narrative. I guess I'm sort of questioning whether that's productive or not. because frustration seems to lead me to bounce off of things I'd rather not bounce off of, which is frustrating in and of itself
20:51 < muurkha> heh
21:48 < docl> interestingly, uneven-tempered personality trait correlates to reduced cognitive ability in various ways https://www.pnas.org/doi/10.1073/pnas.2212794120
21:49 < docl> (not open access, but the supplemental pdf is fine)
21:51 < docl> not a surprising result, angry does seem like an impairment. I'm hypothesizing that one can leverage it by turning it into obsessive focus (not really the same thing)
21:52 < docl> I'm considering it as an antidote to scott alexander's "oops I lost the lottery of being obsessed with math, guess it's kind of like being gay" https://slatestarcodex.com/2013/06/30/the-lottery-of-fascinations/
21:54 < fenn> try lion's mane
21:54 < docl> because obsessions from anger seem like something people are really malleable, as opposed to like finding it intriguing
21:55 < fenn> i got obsessed with boho post-menopause yoga instructor vibes after taking lion's mane
21:55 < fenn> consider it a reroll
21:55 < docl> intriguing!
21:56 < fenn> previously i hated all that stuff, now i have just added it to the aesthetics i can appreciate
21:57 < fenn> it's worth considering if you want to actually BE a different person
21:58 < fenn> nerds come across as bitter because they recognize the injustice of the world, having been constantly exposed to it, but are powerless to actually do anything about it
21:59 < docl> that's what I assumed originally, but then it occurred to me they might be getting something useful out of it like obsession redirection
21:59 < hprmbridge> Eli> do you still take it?
22:00 < fenn> not regularly. i took it again in early 2022, trying to help a friend of mine by example
22:01 < muurkha> there are people who just make it a regular part of their diet, no?
22:01 < fenn> it's not unlikely that most people accumulate brain damage as they get older, and literally lose cognitive competencies. maybe we can reacquire those by creating new tissue to write on
22:02 < fenn> i had to eat salmon and sardines to get through the extreme fatigue. this is probably the growing neurons needing more DHA
22:02 < fenn> fatigue only lasted a day or two
22:02 < fenn> i don't know what would happen if you ate it every day
22:03 < muurkha> extreme fatigue from eating lion's mane mushrooms?
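[Editor's aside, not part of the log: the big-operator convention docl asked about, and muurkha's meet/join summary, both reduce to one idea: a big Σ, Π, ⋃, ⋁ or ⋀ is just a binary operation folded over an indexed family, which is exactly what a reduce/fold does. The example data below is invented; the lattice facts (join = supremum = max, meet = infimum = min, with "or"/"and" as the Boolean cases) are standard.]

```python
from functools import reduce

xs = [3, 1, 4, 1, 5]

total   = reduce(lambda a, b: a + b, xs)   # capital sigma: iterated +
product = reduce(lambda a, b: a * b, xs)   # capital pi: iterated *
join    = reduce(max, xs)                  # big V: join = supremum = max
meet    = reduce(min, xs)                  # big inverted V: meet = infimum = min

assert (total, product, join, meet) == (14, 60, 5, 1)

# Boolean algebra with False < True: "or" is join (max), "and" is meet (min),
# which is why docl's "backwards" reading trips people up
assert (True or False) == max(True, False) == True
assert (True and False) == min(True, False) == False
```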
22:23 < fenn> yes
22:32 -!- ANACHRON [~Malvolio@idlerpg/player/Malvolio] has quit [Ping timeout: 250 seconds]
22:36 < muurkha> huh, interesting
--- Log closed Tue Jul 11 00:00:55 2023