havenwood changed the topic of #ruby to: Ruby 3.4.3, 3.3.8 https://www.ruby-lang.org | Log https://libera.irclog.whitequark.org/ruby
crespire has quit [Killed (NickServ (GHOST command used by crespire1))]
crespire1 has joined #ruby
Munto has quit [Ping timeout: 265 seconds]
Munto has joined #ruby
__jmcantrell__ has quit [Ping timeout: 244 seconds]
mange has joined #ruby
cappy has joined #ruby
ftajhii has quit [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
ftajhii has joined #ruby
eddof13 has joined #ruby
eddof13 has quit [Client Quit]
cappy has quit [Quit: Leaving]
__jmcantrell__ has joined #ruby
Linux_Kerio has joined #ruby
__jmcantrell__ has quit [Quit: WeeChat 4.6.2]
phenom has quit [Ping timeout: 248 seconds]
Pixi` is now known as Pixi
phenom has joined #ruby
patrick has quit [Ping timeout: 252 seconds]
patrick_ is now known as patrick
patrick_ has joined #ruby
patrick has quit [Changing host]
patrick has joined #ruby
patrick_ is now known as patrick
patrick_ has joined #ruby
crespire has joined #ruby
Milos_ has joined #ruby
crespire1 has quit [Ping timeout: 252 seconds]
CalimeroTeknik has quit [Ping timeout: 252 seconds]
Milos has quit [Ping timeout: 252 seconds]
CalimeroTeknik has joined #ruby
Milos_ is now known as Milos
Linux_Kerio has quit [Ping timeout: 276 seconds]
grenierm has joined #ruby
rvalue has quit [Ping timeout: 265 seconds]
rvalue has joined #ruby
grenierm has quit [Ping timeout: 240 seconds]
grenierm has joined #ruby
<o0x1eef> OpenAI "tool calls" are not at all what I thought but also pretty cool. I can see how you could design agents with this API. The biggest surprise is that it is an API for calling code that's local to you rather than OpenAI itself. OpenAI just parses natural language into what is essentially a function call, and then says "hey, call your function X with args Y and Z".
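(A minimal Ruby sketch of the flow described above, against the public OpenAI chat completions endpoint; the run_command tool and its schema are made up for illustration, not taken from any project mentioned here.)

    require "net/http"
    require "json"
    require "uri"

    # Describe a local function to the model. The model never runs it;
    # it only learns the name and the JSON Schema of its arguments.
    tools = [
      {
        type: "function",
        function: {
          name: "run_command",  # hypothetical local function
          description: "Run a shell command and return its output",
          parameters: {
            type: "object",
            properties: { command: { type: "string" } },
            required: ["command"]
          }
        }
      }
    ]

    uri = URI("https://api.openai.com/v1/chat/completions")
    req = Net::HTTP::Post.new(uri,
      "Content-Type"  => "application/json",
      "Authorization" => "Bearer #{ENV["OPENAI_API_KEY"]}")
    req.body = {
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: "List the files in /tmp" }],
      tools: tools
    }.to_json

    res = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) { |http| http.request(req) }
    message = JSON.parse(res.body).dig("choices", 0, "message")

    # The response hands back the name of *your* function plus
    # JSON-encoded arguments; calling it (or not) is up to your code.
    Array(message["tool_calls"]).each do |call|
      name = call.dig("function", "name")
      args = JSON.parse(call.dig("function", "arguments"))
      puts "model wants #{name}(#{args.inspect})"
    end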
andy-turner has joined #ruby
andy-turner has quit [Remote host closed the connection]
andy-turner has joined #ruby
sarna has joined #ruby
balrog has quit [Ping timeout: 252 seconds]
balrog_ has joined #ruby
grenierm has quit [Quit: Client closed]
Thanzex025 has quit [Ping timeout: 244 seconds]
TomyLobo has joined #ruby
CalimeroTeknik has quit [Changing host]
CalimeroTeknik has joined #ruby
Vonter has quit [Ping timeout: 260 seconds]
donofrio has joined #ruby
Vonter has joined #ruby
Vonter has quit [Ping timeout: 272 seconds]
Vonter has joined #ruby
<adam12> o0x1eef: I have a client right now using OpenAI tool calling to do some interesting things. I had never heard of it until I joined on with them.
GreenResponse has joined #ruby
nakilon has quit [Ping timeout: 276 seconds]
<o0x1eef> adam12: It has got me really excited
<o0x1eef> This is what I built with it so far -- a tool that can run any shell command via natural language: https://github.com/llmrb/llm/commit/1fcabd201f357d055a35ecaf0e25ad6f03df9fb6
<o0x1eef> I think in theory you could probably fake it on Gemini with JSON Schema as well
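(Continuing the sketch above: one way to dispatch such a tool call to a local shell. This mirrors the idea of the linked commit but does not use the llm gem's actual API; the handler name and response shape sent back as a "tool" message are the standard OpenAI follow-up format, everything else is illustrative.)

    require "open3"
    require "json"

    # Hypothetical handler for the run_command tool sketched earlier.
    def handle_tool_call(call)
      return unless call.dig("function", "name") == "run_command"
      args = JSON.parse(call.dig("function", "arguments"))
      stdout, stderr, status = Open3.capture3(args["command"])
      # The command's result goes back to the model as a "tool" message
      # so it can summarise or act on the output in its next reply.
      {
        role: "tool",
        tool_call_id: call["id"],
        content: { stdout: stdout, stderr: stderr, exit: status.exitstatus }.to_json
      }
    end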
Linux_Kerio has joined #ruby
factor has quit [Read error: Connection reset by peer]
andy-turner has quit [Quit: Leaving]
factor has joined #ruby
andy-turner has joined #ruby
factor has quit [Client Quit]
<adam12> o0x1eef: Oh cool.
user71 has joined #ruby
mange has quit [Quit: Zzz...]
user23 has joined #ruby
Thanzex025 has joined #ruby
eddof13 has joined #ruby
eddof13 has quit [Client Quit]
gemmaro_ has quit [Ping timeout: 268 seconds]
gemmaro has joined #ruby
svm has joined #ruby
eddof13 has joined #ruby
msv has quit [Ping timeout: 252 seconds]
eddof13 has quit [Client Quit]
ross has left #ruby [#ruby]
svm has quit [Ping timeout: 265 seconds]
cappy has joined #ruby
peder has quit [Ping timeout: 252 seconds]
<o0x1eef> Nice
__jmcantrell__ has joined #ruby
peder has joined #ruby
peder has quit [Ping timeout: 245 seconds]
<havenwood> o0x1eef: LLM is looking nice! IMHO, switch from venture-backed Ollama to llama-cpp and llama-swap.
<havenwood> Ollama is a thin wrapper and adds little for me except a Golang runtime I'd just as well avoid. YMMV.
<o0x1eef> That's nice. I didn't know that. Agreed.
<o0x1eef> Wow, it even supports audio: https://github.com/mostlygeek/llama-swap
<havenwood> o0x1eef: I haven't missed a thing from Ollama and have liked a lot of odds and ends from both llama-cpp and llama-swap.
<havenwood> Also upstream and open source, so seems like wins all around.
<o0x1eef> Yeah, I am definitely going to look into this more. I'll add an issue to my local gitea instance. It's not entirely clear what changes I'll need to make; it sounds like I might need separate provider(s), but since there's some OpenAI compatibility it might be relatively easy
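(A sketch of what that OpenAI compatibility can look like in practice: the same chat-completions request shape, pointed at a local llama-swap or llama.cpp llama-server endpoint instead of api.openai.com. The port and model name are assumptions and depend on the local config.)

    require "net/http"
    require "json"
    require "uri"

    # llama-swap proxies an OpenAI-style API and picks which backend
    # to load based on the "model" field of the request.
    uri = URI("http://localhost:8080/v1/chat/completions")
    req = Net::HTTP::Post.new(uri, "Content-Type" => "application/json")
    req.body = {
      model: "qwen2.5-7b-instruct",  # assumed model name from the local config
      messages: [{ role: "user", content: "Hello from Ruby" }]
    }.to_json

    res = Net::HTTP.start(uri.hostname, uri.port) { |http| http.request(req) }
    puts JSON.parse(res.body).dig("choices", 0, "message", "content")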
dviola has quit [Ping timeout: 276 seconds]
factor has joined #ruby
Sheilong has joined #ruby
peder has joined #ruby
diego has joined #ruby
diego has quit [Max SendQ exceeded]
diego has joined #ruby
peder has quit [Ping timeout: 276 seconds]
peder has joined #ruby
Linux_Kerio has quit [Ping timeout: 268 seconds]
diego has quit [Ping timeout: 248 seconds]
diego has joined #ruby
wbooze_ has joined #ruby
wbooze has quit [Ping timeout: 260 seconds]
wbooze_ is now known as wbooze
cappy has quit [Quit: Leaving]
sarna has quit [Ping timeout: 248 seconds]
sarna has joined #ruby
__jmcantrell__ has quit [Ping timeout: 276 seconds]
eddof13 has joined #ruby
GreenResponse has quit [Quit: Leaving]
andy-turner has quit [Quit: Leaving]
andy-turner has joined #ruby
andy-turner has quit [Remote host closed the connection]
eddof13 has quit [Ping timeout: 276 seconds]
nmollerup has joined #ruby
user71 has quit [Quit: Leaving]
__jmcantrell__ has joined #ruby
nmollerup has quit [Quit: Leaving]
__jmcantrell__ has quit [Ping timeout: 252 seconds]
ruby[bot] has quit [Remote host closed the connection]
ruby[bot] has joined #ruby
Sheilong has quit []
bhaak has quit [Ping timeout: 265 seconds]