Machine Learning in Julia
There are two main machine learning libraries in Julia: Flux.jl (https://fluxml.ai/) and Knet.jl (https://github.com/denizyuret/Knet.jl).
Flux.jl
Reference: https://fluxml.ai/Flux.jl/stable/
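If Flux is not installed yet, it can be added through Julia's package manager. A minimal sketch, assuming a Julia 1.x environment with access to the General registry:

using Pkg;
Pkg.add("Flux");  # install Flux.jl into the active environment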
using Flux;
using Flux.Tracker: update!;

# Create layers like this
inputSize = 10;
outputSize = 20;
myLayer = Dense(inputSize, outputSize, relu);

# You can get the weights and biases using
myLayer.W;
myLayer.b;

# You can call the layer to make a prediction
myData = rand(Float32, inputSize);  # placeholder input vector for the examples
myLayer(myData);
# Equivalent to
relu.(myLayer.W * myData .+ myLayer.b);

# Create networks like this
model = Chain(
    Dense(10, 20, relu),
    Dense(20, 2)
);
# Calling the model will pass data through each layer.
model(myData);

# Get the parameters for the whole model with
p = params(model);

# Calculate the gradient with
myLabels = rand(Float32, 2);  # placeholder targets matching the model output size
gs = Tracker.gradient(function ()
    predicted = model(myData);
    loss = sum((predicted - myLabels).^2);
    return loss;
end, p);
# Gradient of layer 1 weights
gs[model[1].W];

# update! will apply the change to the weights and clear the gradient.
# Make sure to update all layers.
update!(model[1].W, -0.1 * gs[model[1].W]);
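The single update! call above only changes the weights of the first layer. To take a full gradient-descent step, loop over the same parameter collection the gradient was taken with respect to. A minimal sketch using the Tracker-based API from the listing above; the learning rate of 0.1 is an arbitrary value for illustration:

# Apply one gradient-descent step to every parameter of every layer
learningRate = 0.1;
for w in p
    update!(w, -learningRate * gs[w]);
end

Flux also provides optimisers such as ADAM and a Flux.train! helper that wrap this gradient/update loop; see the reference above for details.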