So is anyone doing interesting stuff by letting Crystal drive the LLM?
After having Gemini explain it to me, I realized that “agentic coding” is basically just giving the LLM some tools and letting it run in a feedback loop, and I guess that’s pretty much what’s driving the recent OpenClaw hullabaloo. Was thinking it might be interesting to build a custom agent for something.
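That loop is small enough to sketch in Crystal. This is a toy, not a real harness: the tool names are made up, and `model` is a stub standing in for an actual LLM API call.

```crystal
# Minimal agentic loop: the model's reply either requests a tool or finishes,
# and tool results get fed back into the conversation history.

def run_tool(name : String, arg : String) : String
  case name
  when "read_file" then File.exists?(arg) ? File.read(arg) : "no such file"
  when "shell"     then `#{arg}`
  else                  "unknown tool: #{name}"
  end
end

def model(history : Array(String)) : String
  # Stub: a real harness would POST the history to an LLM API here.
  # Pretend the model asks for one tool call, then wraps up.
  history.any?(&.starts_with?("RESULT:")) ? "FINAL: done" : "TOOL: shell echo hi"
end

history = ["TASK: say hi"]
loop do
  reply = model(history)
  history << reply
  break if reply.starts_with?("FINAL:")
  _, name, arg = reply.split(' ', 3) # parse "TOOL: <name> <arg>"
  history << "RESULT: #{run_tool(name, arg)}"
end
puts history.join("\n")
```

The whole trick is that `break`: the agent decides for itself when it’s done, everything else is just string plumbing.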
Not that Crystal is particularly suited for working with LLMs, but neither are a lot of other languages that people are using to build LLM stuff (hello nodejs), so it might as well join the fun.
So are people actually using LLMs for anything but writing code?
I’m fishing for anything where Crystal drives the LLM in some way, from full-blown agents to “I use a Crystal wrapper to make Claude colorize the output of ls”.
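For what it’s worth, the small end of that spectrum is only a few lines of stdlib Crystal. A hedged sketch against Anthropic’s Messages API: the endpoint, headers, and body fields below match their published docs as I remember them, and the model name is purely illustrative, so double-check before relying on any of it.

```crystal
require "http/client"
require "json"

# Toy wrapper: capture `ls` output and ask Claude to colorize it.
listing = `ls -l`

body = {
  "model"      => "claude-sonnet-4-20250514", # illustrative model name
  "max_tokens" => 1024,
  "messages"   => [{
    "role"    => "user",
    "content" => "Add ANSI color escapes to this directory listing:\n#{listing}",
  }],
}.to_json

response = HTTP::Client.post(
  "https://api.anthropic.com/v1/messages",
  headers: HTTP::Headers{
    "x-api-key"         => ENV["ANTHROPIC_API_KEY"],
    "anthropic-version" => "2023-06-01",
    "content-type"      => "application/json",
  },
  body: body,
)

puts JSON.parse(response.body)["content"][0]["text"]
```

No shards needed, which is kind of the point: `HTTP::Client` and `JSON` from the stdlib already cover the plumbing.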
Whoa… But of course, OpenClaw isn’t the only show in town, so why not in Crystal? In what way would it be worse than the competition?
crybot is getting better and better every day. Sadly, I got my antigravity subscription cancelled (probably for using OpenClaw), so I abandoned the project of integrating that with it.
Also, the latest version is not in FreeBSD, but the maintainer has some family stuff to deal with at the moment. So it might be a month or two more. The maintainer asked if there are any patches to get it going; if I have some time, I may try my hand at it. I don’t think there are significant changes that would require a lot of other work to get it going, I just need to set something up locally so I can test properly instead of on my VPSes.
Back when AutoGPT had first come out, I built Agent Amber as a CLI tool that was basically what OpenClaw is trying to be today.
Also, it looks like OpenAI seized on Anthropic’s bitterness about the whole topic and is now sponsoring that project entirely. I thought that was fascinating.
But yeah, writing an agent harness in Crystal is totally doable. In fact, I think it’s the future, especially since you can have super lightweight CPU-driven programs steering the GPU-heavy parts of applications. I think that’s a very real future that we’re working towards.