The SHAInet shard is a neural network library written in pure Crystal. It supports a number of experimental features and cutting-edge research ideas while also providing the basic capabilities needed for doing ML research in Crystal.
The library was put on hold while work happened on other fronts, but we will be taking some time to fix, improve, and optimize the project.
Right now it compiles and works on the latest Crystal (1.7.0), and all specs are green.
As a simple example of what you can do with it, here is how to solve XOR:
```crystal
require "shainet"

# Input/expected-output pairs for XOR
training_data = [
  [[0, 0], [0]],
  [[1, 0], [1]],
  [[0, 1], [1]],
  [[1, 1], [0]],
]

# Initialize a new network
xor = SHAInet::Network.new

# Add a new layer of the input type with 2 neurons and classic neuron type (memory)
xor.add_layer(:input, 2, :memory, SHAInet.sigmoid)
# Add a new layer of the hidden type with 2 neurons and classic neuron type (memory)
xor.add_layer(:hidden, 2, :memory, SHAInet.sigmoid)
# Add a new layer of the output type with 1 neuron and classic neuron type (memory)
xor.add_layer(:output, 1, :memory, SHAInet.sigmoid)

# Fully connect the network layers
xor.fully_connect

# Adjust network parameters
xor.learning_rate = 0.7
xor.momentum = 0.3

# Train with: data, training_type, cost_function, epochs,
# error_threshold (sum of errors), and logging frequency
xor.train(
  data: training_data,
  training_type: :sgdm,
  cost_function: :mse,
  epochs: 5000,
  error_threshold: 0.000001,
  log_each: 1000)

# Run the trained network
xor.run([0, 0])
```
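After training, you can evaluate the network on each input pattern and compare its outputs against the XOR truth table. The loop below is a sketch, assuming the `xor` network and `run` method from the example above:

```crystal
# Print the network's output for all four XOR input combinations
[[0, 0], [0, 1], [1, 0], [1, 1]].each do |input|
  puts "#{input} => #{xor.run(input)}"
end
```

With a sigmoid output, the results are floats that should approach 0 or 1 as training converges, rather than exact integers.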
What would you like to see? What features are missing for you? Let us know!