Does anyone deploy Crystal apps on ARM-based Linux servers? What is the deployment process like?

I want to purchase a server, and I found that ARM-based Linux servers are so cheap …



I wrote a server for the Gemini protocol that, until recently, ran on a Raspberry Pi 4. For my particular case, it was just a matter of getting a working compiler. I started with a statically compiled one from Alpine (I think?), then used that to build a newer version of the compiler and made a system package of it. From there, it was just shards build as usual.

I’m at the point where I need to start looking into a VM or container, though, because the distro I use (Slackware) has already upgraded to LLVM 16 on its AArch64 branch. Git is cumbersome for me, so applying the PR for LLVM 16 support locally is more of a pain.

I had an RPi 3 for a while that I was using with Crystal. It ran a space bucket controller for a hydroponic setup I had: a small web GUI built with Kemal, used with the small touch screen on the front, that managed the fans, lights, and dehumidifier using a DHT22 sensor. I would build it directly on the device itself, and it was really easy. I wrote a simple GitHub webhook handler to check whether the repo had been pushed to; if it had, it would re-download the code, recompile, and restart the server automatically. The whole process was easy enough, other than compiling Crystal from source to get it working.
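A pull-build-restart loop like the one described above can be sketched roughly like this (the paths, branch name, and service name are placeholders for illustration, not from the original setup):

```shell
#!/bin/sh
# Sketch: rebuild and restart a Crystal app on the device when the repo moves.
# APP_DIR, the branch, and the restart command are assumptions.
set -e
APP_DIR="$HOME/spacebucket"

cd "$APP_DIR"
git fetch origin
# Only rebuild when the remote branch has new commits.
if [ "$(git rev-parse HEAD)" != "$(git rev-parse origin/main)" ]; then
  git pull origin main
  shards build --release          # compile directly on the Pi
  systemctl restart spacebucket   # or however the process is supervised
fi
```

In the original setup this was triggered by a GitHub webhook rather than polling, but the rebuild-and-restart steps are the same.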


Make sure your dependencies have ARM support first. I ran into this with Lexbor, which has no package for Linux on ARM (though it does have a macOS version for ARM), while trying to run Linux Docker images locally. Obviously you can work around this, but beware: it may require a lot of additional configuration or compiling libraries from source.
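As a quick sanity check before chasing package availability, it helps to see what architecture your build environment actually reports (a small sketch; the normalized names are the ones Docker images and most release pages are keyed on):

```shell
# Print the machine architecture and a normalized name.
arch="$(uname -m)"
case "$arch" in
  aarch64|arm64) norm="arm64" ;;
  x86_64|amd64)  norm="amd64" ;;
  *)             norm="$arch" ;;
esac
echo "machine: $arch (normalized: $norm)"
```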


I do: AWS Graviton2 (t4g) and Graviton3 (m7g), and some Ampere Altra (no, not the full machine with 80 cores, only small VMs that are part of the huge machine) :grin:

Lots of other hosting companies are starting to show offerings using Ampere Altra, so things look promising on that front! :blush:

Since I still use Intel/AMD for development, I use my own musl-based container (Alpine Linux) to build static versions of the applications I deliver.

For that, I combine Crystal, zig cc, and static versions of the libraries needed for aarch64, and use crystal build --cross-compile.

There are a few caveats:

  1. If your app depends on OpenSSL, be careful: the build will try to use pkg-config and point at your local version of OpenSSL instead of the target one (the trick is to set PKG_CONFIG_LIBDIR to point at the cross-compilation pkgconfig directory for aarch64).
  2. Even if it compiles successfully, there might be issues with OpenSSL across distros: on some it will not be able to find /etc/ssl/certs, or you may hit other problems. So far (:crossed_fingers:) the static musl builds run without issues on Ubuntu 22.04 LTS.
  3. Multi-threading is a bit flaky at this stage (-Dpreview_mt); I don’t have concrete examples, but I’ve encountered random errors that are hard to reproduce.
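Putting those pieces together, the two-step workflow looks roughly like this (a sketch: the sysroot path and the library list are placeholders, not an exact copy of the setup described above):

```shell
# Cross-compile a Crystal app for aarch64-linux-musl from an x86_64 host.
# Caveat 1: point pkg-config at the target's .pc files, not the host's.
export PKG_CONFIG_LIBDIR=/opt/aarch64-sysroot/lib/pkgconfig

# Step 1: Crystal emits an object file for the target
# and prints the linker command it would have run.
crystal build src/app.cr --release --static \
  --cross-compile --target aarch64-linux-musl -o app

# Step 2: link the object file with zig cc acting as the cross-linker,
# pointing at the static aarch64 libraries (e.g. from magic-haversack).
zig cc -target aarch64-linux-musl app.o -o app \
  -L/opt/aarch64-sysroot/lib -lpcre2-8 -lgc -levent -lz
```

The exact libraries in step 2 depend on what the app links against; Crystal's printed link command from step 1 tells you what is needed.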

I showed a little bit of this in a few videos here and here.

I’ve also pre-packaged a list of dependencies in the magic-haversack repository, so you only need to download and extract them, then combine them with your local Crystal and Zig compilers to produce those static binaries targeting aarch64-linux-musl.

I was hoping to do a part 3 showing automated cross-compilation using GitHub Actions, but it got a bit trickier to generate macOS binaries (due to a macOS SDK version mismatch). More to come on that at some point :sweat_smile:



What is Zig doing there? Surely you only need that for your own project, not to make the cross-compilation work?

Short answer (tl;dr): No, you don’t need it, but you might want to use it. I’m using zig cc as a portable cross-linker/compiler. You can read more about how zig cc helps with the cross-compilation/linking process here.

Long answer: you need a linker capable of understanding the target platform you’re cross-compiling for. Most of the time this takes the shape of an entire gcc/clang toolchain and other utilities prefixed with the target platform (e.g. aarch64-linux-gcc, aarch64-linux-ld, etc.), sometimes provided by your distribution; see musl-cross-make, and all the complications associated with that.

zig cc removes the need to figure out these cross-toolchains: it bundles the musl source code and the necessary pieces to build for all of its supported targets, reducing the complexity.
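A minimal illustration of that point, assuming Zig is installed (the file name and targets are arbitrary):

```shell
# A trivial C program to cross-compile.
cat > hello.c <<'EOF'
#include <stdio.h>
int main(void) { puts("hello"); return 0; }
EOF

# The same zig binary handles different targets --
# no per-target gcc/ld toolchain to install.
zig cc -target aarch64-linux-musl hello.c -o hello-arm64
zig cc -target x86_64-linux-musl  hello.c -o hello-amd64

file hello-arm64   # should report an ARM aarch64 ELF executable
```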

Combine that simplicity with the existing static libraries you need and you have a static binary for any of those platforms (even macOS as a target, although that is a bit more complex, as it requires the macOS SDK).

I’ve done a lot of cross-compilation in the past (raw C projects with lots of dependencies), and Crystal + Alpine packages + zig cc to solve the cross-linking is a simple combination that just works.



I’ve never used Zig. Would it be possible to borrow that strength from Zig and migrate the cross-compilation ability into Crystal?

Crystal is already quite capable on the cross-compilation side. What Zig does well is making the cross-linking and the choice of libc (musl, glibc, or the macOS one) quite transparent.

I’ve been following Zig’s development of that feature for a while, and I would say it is quite complex. Implementing something similar in Crystal would take time away from other features that are perhaps worth more of the team’s effort.


> What Zig does well is making the cross-linking and the choice of libc (musl, glibc, or the macOS one) quite transparent.
> Implementing something similar in Crystal would take time away from other features that are perhaps worth more of the team’s effort.

I consider this one of the most exciting parts that makes Go/Rust more portable than Crystal.

I would say that for Go, but not so much for Rust, which suffers from the same linking issue. There have been attempts like cargo-zigbuild to facilitate that, but you can still be bitten by glibc version mismatches (something that works much better with zig cc, though there are still issues to solve).