Release notes
General
- The Num.cr `frame` module has been removed. It was more of a proof of concept of a DataFrame knowing types at compile time, and until I have more time to work on it I would rather not add complexity to the library.
- `Num::Rand` now uses Alea under the hood for random distribution generation.
- All map / reduce iterators have been rewritten using `yield` patterns, speeding up standard iteration by around 30% across the board (see the sketch after this list).
- Tensors can now be sorted, both flat and along axes (example below).
- Matrix exponentials using Padé approximation (formula below).
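The iterator speedup comes from Crystal's block semantics: a method that uses `yield` has its block inlined at the call site, while a method that captures its block as a `Proc` pays an allocation plus an indirect call on every element. A minimal illustration of the pattern, in generic code rather than Num.cr's internals:

```crystal
# Proc-capturing version: the block is reified as a `Proc`, adding an
# allocation and an indirect call for every element.
def map_captured(values : Array(Int32), &block : Int32 -> Int32) : Array(Int32)
  result = Array(Int32).new(values.size)
  values.each { |v| result << block.call(v) }
  result
end

# Yielding version: the compiler inlines the block at the call site, so
# the per-element work runs with no Proc indirection.
def map_yielding(values : Array(Int32)) : Array(Int32)
  result = Array(Int32).new(values.size)
  values.each { |v| result << yield(v) }
  result
end

puts map_captured([1, 2, 3]) { |v| v * 2 } # => [2, 4, 6]
puts map_yielding([1, 2, 3]) { |v| v * 2 } # => [2, 4, 6]
```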
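Sorting works both over all elements and along a single axis. A small usage sketch; the `axis` argument shown here is an assumption, so check the API docs for the exact signature:

```crystal
require "num"

t = [[3, 1], [2, 4]].to_tensor

# Sort treating the tensor as flat, then sort along an axis. The `axis`
# argument is an assumption based on these release notes; consult the
# Num.cr docs for the exact signature.
puts t.sort
puts t.sort(axis: 0)
```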
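For reference, the (p, q) Padé approximant replaces the truncated Taylor series for the matrix exponential with a ratio of two matrix polynomials, which is far more accurate per term; implementations typically pair it with scaling and squaring:

```latex
e^{A} \approx R_{pq}(A) = \left[ D_{pq}(A) \right]^{-1} N_{pq}(A),
\qquad
N_{pq}(A) = \sum_{j=0}^{p} \frac{(p+q-j)!\, p!}{(p+q)!\, j!\, (p-j)!} A^{j},
\qquad
D_{pq}(A) = \sum_{j=0}^{q} \frac{(p+q-j)!\, q!}{(p+q)!\, j!\, (q-j)!} (-A)^{j}
```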
Autograd (Num::Grad)
- Pure Crystal implementation of autograd, tracking operations across a computational graph
- Currently supports most arithmetic operators, as well as slicing, reshaping, and matrix multiplication (see the sketch after this list)
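A minimal sketch of how the graph tracking is meant to be used. The spellings of `Context`, `variable`, `backprop`, and `grad` follow the repository's examples but are assumptions here, not verified signatures:

```crystal
require "num"

# A Context records every operation performed on its variables.
ctx = Num::Grad::Context(Tensor(Float64)).new

x = ctx.variable([3.0].to_tensor)
y = ctx.variable([2.0].to_tensor)

# The multiplication is recorded on the graph; backprop walks it in
# reverse, accumulating gradients into each variable.
f = x * y
f.backprop

puts x.grad # df/dx = y = [2.0]
puts y.grad # df/dy = x = [3.0]
```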
Neural Networks (Num::NN)
- Extended `Num::Grad` to add pure Crystal machine learning algorithms, layers, and activations
- Currently supports Linear, ReLU, Sigmoid, Flatten, and 2D convolutional layers; Adam and SGD optimizers; and sigmoid cross entropy and MSE losses (all written in pure Crystal except for 2D convolution, which uses NNPACK). A usage sketch follows this list.
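A sketch of the builder-style API based on the examples in the repository; the DSL names (`Num::NN::Network.new`, `linear`, `relu`, `sgd`, `sigmoid_cross_entropy_loss`) and the training-loop calls are assumptions, so treat this as illustrative rather than exact:

```crystal
require "num"

ctx = Num::Grad::Context(Tensor(Float64)).new

# XOR toy data: inputs as a variable on the graph, targets as a plain Tensor.
x_train = ctx.variable([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]].to_tensor)
y_train = [[0.0], [1.0], [1.0], [0.0]].to_tensor

# Builder-style definition; the DSL names below are assumptions taken
# from the repository's examples, not guaranteed signatures.
net = Num::NN::Network.new(ctx) do
  linear 2, 3
  relu
  linear 3, 1
  sgd 0.7
  sigmoid_cross_entropy_loss
end

# Standard loop: forward pass, loss, backprop, optimizer step.
500.times do
  y_pred = net.forward(x_train)
  loss = net.loss(y_pred, y_train)
  loss.backprop
  net.optimizer.update
end
```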
I think the library is in a great place: it's getting consistently faster, with more functionality being added. But I am still looking for other developers interested in numerical computing who would like to become core contributors.
That being said, if you are looking for a library to learn a lot more about the fundamentals of machine learning and automatic differentiation, and you've had a hard time understanding what goes on under the hood in a library like TensorFlow or Torch, I would encourage you to check out Num.cr.