The Crystal Programming Language Forum

MXNet.cr, A Library for Deep Learning

I’ve been writing bindings to MXNet, a framework for deep learning (and machine learning, in general).

It’s time to show the shard to the community. I’d love feedback on the library, in particular on installation and usage.

I started down this road in Ruby so that I could work through Deep Learning - The Straight Dope without having to dust off Python or resort to any of the other supported languages. I eventually rewrote the library in Crystal, and plan to continue development here.

Installation

To use the shard you need to install MXNet itself. The library depends on libmxnet.so (the MXNet shared library), which is easiest to obtain by installing the Python “mxnet” package. For installation options on your platform, see: https://mxnet.apache.org/get_started
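On most platforms the prebuilt CPU package is enough to get going. A minimal sketch, assuming you have pip available (GPU builds and platform-specific variants are covered on the get_started page above):

```shell
# Install the prebuilt MXNet package, which bundles libmxnet.so.
# Package names and versions vary by platform and for GPU support;
# see https://mxnet.apache.org/get_started for the details.
pip install mxnet
```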

Usage

The core of MXNet is NDArray, a data structure that represents a homogeneous, multidimensional array of values. NDArray supports fast matrix operations on a wide range of hardware configurations, including GPUs.

require "mxnet"
a = MXNet::NDArray.array([[1, 2], [3, 4]]) # a 2x2 matrix
b = MXNet::NDArray.array([1, 0])           # a length-2 vector
puts a * b # element-wise product, broadcasting b across the rows of a

I’ve also ported a large part of Gluon, a high-level API for deep learning.

net = MXNet::Gluon::NN::HybridSequential.new.tap do |net|
  # When instantiated, `HybridSequential` stores a chain of
  # neural network layers. Once presented with data, it executes
  # each layer in turn, using the output of one layer as the input
  # for the next. Calling `#hybridize` caches the neural network
  # for high performance.
  net.with_name_scope do
    net.add(
      MXNet::Gluon::NN::Dense.new(64, activation: :relu), # 1st hidden layer (64 nodes)
      MXNet::Gluon::NN::Dense.new(64, activation: :relu), # 2nd hidden layer (64 nodes)
      MXNet::Gluon::NN::Dense.new(10)
    )
  end
  net.init
  net.hybridize
end
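Once built, you run data through the network to get predictions. A hedged sketch, assuming the shard mirrors the Python Gluon API (the `.call` syntax and `random_uniform` method name here are my assumptions; check the docs linked below for the shard's actual method names):

```crystal
require "mxnet"

# Hypothetical usage -- method names follow the Python Gluon API
# and may differ slightly in MXNet.cr.
data = MXNet::NDArray.random_uniform(shape: [32, 64]) # a batch of 32 inputs
output = net.call(data)                               # forward pass
# The final Dense layer has 10 units, so the output
# has one row of 10 values per input example.
puts output.shape
```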

Full docs here:

https://toddsundsted.github.io/mxnet.cr/

Let me know what you think!

This is pretty interesting. I don’t do much ML, but ArtLinkov and bararchy do and might be interested.