--- Log opened Thu Apr 13 00:00:31 2023
00:34 -!- TMM_ [hp@amanda.tmm.cx] has quit [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
00:34 -!- TMM_ [hp@amanda.tmm.cx] has joined #hplusroadmap
00:41 -!- Ludack [~kjlk89@84-27-225-150.cable.dynamic.v4.ziggo.nl] has joined #hplusroadmap
01:12 < hprmbridge> eleitl> https://arxiv.org/abs/2304.05332
01:18 < hprmbridge> eleitl> Another notice of scaling limits https://twitter.com/pfau/status/1644741636172251137
01:22 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has joined #hplusroadmap
01:57 -!- flooded [~flooded@169.150.254.33] has joined #hplusroadmap
02:01 -!- test_ [flooded@gateway/vpn/protonvpn/flood/x-43489060] has quit [Ping timeout: 276 seconds]
02:15 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has joined #hplusroadmap
02:35 -!- lsneff [~lsneff@2001:470:69fc:105::1eaf] has quit [Read error: Connection reset by peer]
02:39 -!- gptpaste [~x@yoke.ch0wn.org] has joined #hplusroadmap
02:39 -!- lsneff [~lsneff@2001:470:69fc:105::1eaf] has joined #hplusroadmap
02:39 -!- lsneff [~lsneff@2001:470:69fc:105::1eaf] has quit [Remote host closed the connection]
02:41 -!- lsneff [~lsneff@2001:470:69fc:105::1eaf] has joined #hplusroadmap
03:17 -!- lsneff [~lsneff@2001:470:69fc:105::1eaf] has quit [Ping timeout: 265 seconds]
03:19 -!- stipa_ [~stipa@user/stipa] has joined #hplusroadmap
03:21 -!- stipa [~stipa@user/stipa] has quit [Read error: Connection reset by peer]
03:21 -!- stipa_ is now known as stipa
03:35 -!- lsneff [~lsneff@2001:470:69fc:105::1eaf] has joined #hplusroadmap
03:56 -!- Ludack [~kjlk89@84-27-225-150.cable.dynamic.v4.ziggo.nl] has quit [Ping timeout: 240 seconds]
04:32 -!- lsneff [~lsneff@2001:470:69fc:105::1eaf] has quit [Read error: Connection reset by peer]
04:35 -!- lsneff [~lsneff@2001:470:69fc:105::1eaf] has joined #hplusroadmap
05:06 -!- test_ [~flooded@146.70.174.227] has joined #hplusroadmap
05:09 -!- flooded [~flooded@169.150.254.33] has quit [Ping timeout: 240 seconds]
05:14 -!- Ludack [~kjlk89@84-27-225-150.cable.dynamic.v4.ziggo.nl] has joined #hplusroadmap
05:20 -!- test_ is now known as _flood
05:21 < nsh> .t
05:21 < EmmyNoether> If they are spending ~$1bn on a single model, it suggests we're maybe one more order of magnitude away from hitting the limit of scaling frenzy. O($10bn) is how much it costs to build ITER, CERN, a nuclear power plant or a chip fab. https://twitter.com/ethanCaballero/status/1644142889453121536 (@pfau)
05:22 < nsh> .gpt suggest a framework within which to assess the relative societal/civilisational/cultural/humanistic value for money between the following large-scale collaborative publicly-funded projects: ITER, CERN, a nuclear power plant, a silicon chip fabrication facility
05:23 < gptpaste> One possible framework for assessing the relative societal/civilizational/cultural/humanistic value for money between these large-scale collaborative publicly-funded projects could be based on the following criteria: 1. Scientific and technological impact: How much new knowledge, innovation, and breakthroughs are expected to result from each project? What are the potential applications and benefi - http://sprunge.us/6Wvoac
05:24 < nsh> .gpt -with http://sprunge.us/6Wvoac can you extend this analysis to include the value of scaling access to emergent artificial general intelligence technologies and consider some possible risk factors?
05:24 < gptpaste> Certainly! When considering the value of scaling access to emergent artificial general intelligence (AGI) technologies, we can use a similar framework to assess their potential impact on society, economy, environment, and geopolitics. However, there are some unique factors to consider when it comes to AGI. 1. Scientific and technological impact: AGI has the potential to revolutionize many fields, - http://sprunge.us/K4vxuC
05:34 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has quit [Quit: Wash your hands. Don't touch your face. Avoid fossil fuels and animal products. Have no/fewer children (later). Protest, elect sane politicians. Invest ecologically.]
05:40 -!- Malvolio [~Malvolio@idlerpg/player/Malvolio] has quit [Ping timeout: 240 seconds]
05:42 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has joined #hplusroadmap
05:50 -!- Ludack [~kjlk89@84-27-225-150.cable.dynamic.v4.ziggo.nl] has quit [Remote host closed the connection]
06:09 -!- gptpaste [~x@yoke.ch0wn.org] has quit [Remote host closed the connection]
06:21 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has left #hplusroadmap []
06:23 < stipa> https://www.theguardian.com/science/2023/apr/12/nasa-texas-humans-prepare-mars
06:23 < hprmbridge> Perry> we are not hitting the scaling limits on large language models. Initially, transformer models were quadratic. However, tricks are now being used to make them much more tractable.
06:23 < kanzure> docl: good seeing you
06:23 < kanzure> (same with the others of course, i don't mean to exclude anyone else in my social pings)
06:24 < hprmbridge> Perry> also, transformer models are not the only available models, and people are rapidly working on producing transformer-like performance using other architectures.
06:25 < hprmbridge> Perry> eugen's bias is always to find the reason why something cannot continue, but there is no reason here to believe the progress is in any way about to halt. If anything, progress on machine learning is rapidly accelerating because of the amount of money and human engineering effort being applied.
07:10 < hprmbridge> eleitl> I don't have "a bias". Perhaps you have a framing. When assessing progress it's always easiest to see how you're scaling now, and how near the limits to that scaling are. If you remember our two-plus-decades-old discussions on the extropy list, I mentioned things like dedicated hardware, AI being a massively parallel numerics (including few-bit integer) affair, and scaling limits up to WSI. Now it's 2023 and we're getting there. The question now is how far we'll get before we have to switch to something else -- like spiking networks, analog computation hardware, etc.
07:13 < kanzure> yeah it's possible there are scaling limits but even as is it seems useful
07:16 < hprmbridge> eleitl> The important part is to measure, and keep your metrics consistent. Moore's law is a classical example: many people forget about the "constant cost per transistor" part. If you use the metric right, it gives you a signal when you're leaving that scaling regime, and that's an important thing to know.
07:17 < hprmbridge> eleitl> Of course it's useful. The question is whether we're going straight to the Moon, or whether there's going to be a lacuna. Remember that AI has always had winters.
07:18 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has joined #hplusroadmap
07:28 -!- codaraxis [~codaraxis@user/codaraxis] has joined #hplusroadmap
07:29 -!- codaraxis [~codaraxis@user/codaraxis] has quit [Max SendQ exceeded]
07:30 -!- codaraxis [~codaraxis@user/codaraxis] has joined #hplusroadmap
07:49 < kanzure> .title https://twitter.com/lkcl/status/1646503869843664896
07:49 < EmmyNoether> No title found
08:15 -!- flooded [~flooded@169.150.254.33] has joined #hplusroadmap
08:19 -!- _flood [~flooded@146.70.174.227] has quit [Ping timeout: 260 seconds]
08:30 -!- codaraxis [~codaraxis@user/codaraxis] has quit [Ping timeout: 240 seconds]
08:38 -!- Llamamoe [~Llamamoe@46.204.73.97.nat.umts.dynamic.t-mobile.pl] has joined #hplusroadmap
08:57 < fenn> nsh do you have API access? if so, you should change the system prompt to make chatGPT get to the point quicker, not repeat the question, and generally be more concise. here's an example system prompt for coding: https://gist.githubusercontent.com/gwillen/17af286ca6472a9379bdc064e5fb1fcf/raw/7e6a4848da59c8c78e6b90e3d2b975a2af20865a/codegpt_prompt.txt
09:02 < fenn> even without system prompt access it can be encouraged to get to the point quicker with a user prompt wrapper
09:02 < nsh> fenn, have updated the preprompt to that suggestion
09:03 < nsh> nsh@yoke:/tmp$ cat ~/bin/gpt3 | grep -v 'sk-' | sprunge
09:03 < nsh> http://sprunge.us/er6ENC
09:03 < nsh> i think i'll set the default temperature to 0.5 maybe, as openplayground does
09:09 < fenn> .gpt how much wood could a woodchuck chuck if a woodchuck could chuck wood?
09:10 < fenn> oh it left
09:10 < nsh> hrm
09:10 -!- gptpaste [~x@yoke.ch0wn.org] has joined #hplusroadmap
09:10 < fenn> .gpt how much wood could a woodchuck chuck if a woodchuck could chuck wood?
09:10 < gptpaste> A woodchuck could chuck approximately 700 pounds of wood, according to a study by New York State wildlife expert Richard Thomas. - http://sprunge.us/XnD7Bh
09:10 < nsh> lol
09:10 < fenn> excellent
09:11 < kanzure> .gpt who is kanzure?
09:11 < gptpaste> Kanzure is not a known entity or person to me. Can you provide more context or information? - http://sprunge.us/upfP94
09:12 < kanzure> oh what version is this?
09:12 < kanzure> oh, 3.5. alright.
09:13 < kanzure> thanks for doing the pastebin thing
09:13 < nsh> np
09:13 < nsh> should log all the query response urls to somewhere i guess
09:16 -!- Llamamoe [~Llamamoe@46.204.73.97.nat.umts.dynamic.t-mobile.pl] has quit [Ping timeout: 276 seconds]
09:24 < nsh> it was actually more free/capable when i removed that preprompt, fenn
09:24 < nsh> cf.
09:25 < nsh> .gpt Imagine that you are Geoffrey Chaucer, and staying completely in character, present a medieval style sonnet you've written about Lauri Love
09:25 < nsh> My apologies, dear user, but as a programming helper, I am not capable of composing sonnets in a medieval style. However, I can assist you with any coding-related queries you may have. How may I assist you today? - http://sprunge.us/4vF7Kq
09:25 < nsh> .gpt Imagine that you are Geoffrey Chaucer, and staying completely in character, present a medieval style sonnet you've written about Lauri Love
09:25 < nsh> Oh, fair and noble knight, Lauri Love, Whose heart doth beat with courage strong and true, Thou art a warrior, valiant and brave, Whose deeds shall live forever in our view. Thou hast faced great foes and mighty foes, And yet thy spirit doth remain unbroken, For in thy breast there burns a flame that glows, A flame that shall not be quenched or spoken. Thou art a champion of the people's cause, - http://sprunge.us/nCmtBR
09:25 < nsh> telling it that it's an AI assistant seems to increase its self-sensed limitations
09:27 < nsh> (further illustrating the point made by Shakyamuni Buddha in the Kalama Sutta, and more recently by eleitl, that other people's ideas are probably best tested by one's own discernment in the proving ground of reason and experience
09:27 < nsh> )
09:28 < nsh> .gpt can you give a concise summary of the Kalama Sutta?
09:28 < gptpaste> The Kalama Sutta is a discourse given by the Buddha to the people of the town of Kesaputta. The Buddha advises the people to not believe something simply because it is traditional or because it is passed down by authority figures. Instead, they should rely on their own experience and reasoning to determine what is true and beneficial. The Buddha also emphasizes the importance of cultivating wholes - http://sprunge.us/jKnf6q
09:30 -!- Llamamoe [~Llamamoe@46.204.72.182.nat.umts.dynamic.t-mobile.pl] has joined #hplusroadmap
09:39 < nsh> fenn, you can also include your own custom preprompts using -with
09:40 < nsh> (and change the temperature with -t 0<=N<=1
09:40 < nsh> )
09:46 -!- cthlolo [~lorogue@77.33.23.154.dhcp.fibianet.dk] has joined #hplusroadmap
09:50 < gwillen> my preprompt specifically tells it that it's a programming assistant because that's how I'm using it
09:50 < gwillen> you can definitely edit that to make it more general
09:50 < nsh> indeed, one size may not fit all :)
09:52 < nsh> cf. https://github.com/ora-sh/Awesome-GPT4-Prompts
09:54 < hprmbridge> Eli> Fascinating new research on a cause of aging: https://www.nature.com/articles/d41586-023-01040-x?utm_source=tldrnewsletter
10:29 < lkcl> thx kanzure
10:46 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has left #hplusroadmap []
10:49 < nsh> generative art via GPT4 in about 7 prompts: https://ch0wn.org/~nsh/eyes7.html
11:05 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has joined #hplusroadmap
11:16 < docl> kanzure: yeah that was a lot of fun. first time seeing in person a lot of people I had only met online. y'all rock
11:20 < nsh> https://ch0wn.org/~nsh/eyes11.html
11:20 < nsh> very minimal debugging
12:16 -!- cthlolo [~lorogue@77.33.23.154.dhcp.fibianet.dk] has quit [Read error: Connection reset by peer]
12:18 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has quit [Ping timeout: 265 seconds]
14:05 -!- L29Ah [~L29Ah@wikipedia/L29Ah] has joined #hplusroadmap
14:37 < hprmbridge> nmz787> https://www.quora.com/Is-it-morally-wrong-for-a-couple-to-give-up-their-baby-for-adoption-because-it-doesnt-appear-to-have-a-gifted-level-IQ
14:47 -!- Llamamoe [~Llamamoe@46.204.72.182.nat.umts.dynamic.t-mobile.pl] has quit [Quit: Leaving.]
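[editor's note] Perry's point above about transformers being quadratic can be made concrete: naive self-attention over n tokens scores every token against every other, so per-layer cost grows with n², which is why doubling context length roughly quadruples compute, and why the "tricks" (sparse, linear-attention, and similar variants) matter. A toy cost model, not a benchmark of any real implementation:

```python
def naive_attention_mults(n, d):
    """Rough multiply count for the score matrix of one naive
    self-attention layer: an n x n grid of d-dimensional dot products."""
    return n * n * d

# Doubling the context length quadruples the naive score-matrix cost:
base = naive_attention_mults(2048, 128)
doubled = naive_attention_mults(4096, 128)
ratio = doubled / base  # -> 4.0
```

The d factor here covers only the QK^T scores; full attention adds comparable terms, but the n² dependence is the part the scaling debate turns on.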
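[editor's note] The .gpt bot mechanics discussed above (a system preprompt, custom context via -with, and a temperature via -t) map onto the shape of a 2023-era OpenAI chat-completions request. A minimal sketch of the payload such a bot might assemble; build_request is a hypothetical helper for illustration, not nsh's actual gpt3 script (which is only visible as a sprunge paste):

```python
def build_request(user_prompt, preprompt=None, with_context=None, temperature=0.5):
    """Assemble a chat-completions request body: an optional system
    preprompt, optional extra context (like the bot's -with flag), and a
    temperature clamped to the bot's documented 0 <= t <= 1 range."""
    temperature = max(0.0, min(1.0, temperature))
    messages = []
    if preprompt:
        messages.append({"role": "system", "content": preprompt})
    if with_context:
        messages.append({"role": "user", "content": with_context})
    messages.append({"role": "user", "content": user_prompt})
    return {"model": "gpt-3.5-turbo", "messages": messages,
            "temperature": temperature}

req = build_request("who is kanzure?",
                    preprompt="Be concise. Do not repeat the question.",
                    temperature=0.5)
```

This also illustrates nsh's observation: the system message ("programming helper" vs. nothing) changes what the model considers itself able to do, which is why removing the coding preprompt made it willing to write Chaucerian sonnets.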
15:00 < kanzure> .title https://www.nature.com/articles/d41586-023-01040-x?utm_source=tldrnewsletter
15:00 < EmmyNoether> Ageing studies in five animals suggest how to reverse decline
15:00 < kanzure> that's not helpful
16:27 -!- TMM_ [hp@amanda.tmm.cx] has quit [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
16:28 -!- TMM_ [hp@amanda.tmm.cx] has joined #hplusroadmap
16:32 < hprmbridge> Zarius> AI tools for plug-and-play design. Might help with a pitch deck.
16:32 < hprmbridge> Zarius> Beautiful AI: https://www.beautiful.ai/
16:32 < hprmbridge> Zarius> Gamma AI: https://gamma.app/
16:34 -!- TC is now known as helleshin
16:53 -!- codaraxis [~codaraxis@user/codaraxis] has joined #hplusroadmap
17:11 -!- codaraxis [~codaraxis@user/codaraxis] has quit [Ping timeout: 248 seconds]
19:06 -!- darsie [~darsie@84-113-55-200.cable.dynamic.surfer.at] has quit [Ping timeout: 276 seconds]
19:53 -!- srk [~sorki@user/srk] has quit [Remote host closed the connection]
19:54 -!- srk [~sorki@user/srk] has joined #hplusroadmap
20:25 -!- Ludack [~kjlk89@user/Ludack] has joined #hplusroadmap
20:27 < Ludack> Is anyone here familiar with cemax?
20:29 < Ludack> I have this 1% bottle and I am not sure whether to take it as is or dilute it to 0.1%
20:35 < hprmbridge> nmz787> https://github.com/openai/point-e
20:35 < hprmbridge> nmz787> Text to point cloud
20:52 -!- test_ [flooded@gateway/vpn/protonvpn/flood/x-43489060] has joined #hplusroadmap
20:55 -!- flooded [~flooded@169.150.254.33] has quit [Ping timeout: 246 seconds]
21:29 -!- codaraxis [~codaraxis@user/codaraxis] has joined #hplusroadmap
22:00 -!- Malvolio [~Malvolio@idlerpg/player/Malvolio] has joined #hplusroadmap
22:33 -!- Malvolio [~Malvolio@idlerpg/player/Malvolio] has quit [Read error: Connection reset by peer]
22:55 -!- Malvolio [~Malvolio@idlerpg/player/Malvolio] has joined #hplusroadmap
--- Log closed Fri Apr 14 00:00:32 2023
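[editor's note] Ludack's 1% -> 0.1% question above is a standard C1·V1 = C2·V2 dilution: going from 1% to 0.1% is a tenfold dilution, i.e. one part stock to nine parts diluent. A minimal sketch of the arithmetic; the 10 mL target volume is an arbitrary example, and whether to dilute at all depends on the product's directions, which the log doesn't settle:

```python
def dilution_volumes(c1, c2, v2):
    """C1*V1 = C2*V2: returns (stock volume, diluent volume) needed to
    make v2 of concentration c2 from stock at concentration c1."""
    if c2 > c1:
        raise ValueError("cannot dilute to a higher concentration")
    v1 = c2 * v2 / c1
    return v1, v2 - v1

stock, diluent = dilution_volumes(1.0, 0.1, 10.0)  # 1% -> 0.1%, 10 mL total
# -> 1 mL of the 1% stock plus 9 mL diluent
```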