Machine Learning in Julia

From David's Wiki
Latest revision as of 14:20, 4 October 2019

There are two main machine learning libraries in Julia: [https://fluxml.ai/ Flux.jl] and [https://github.com/denizyuret/Knet.jl Knet.jl].

=Flux.jl=

[https://fluxml.ai/Flux.jl/stable/ Reference]<br>
See [[Flux]]

==Basic Usage==

<syntaxhighlight lang="julia">
using Flux;
using Flux.Tracker: update!, gradient;

# Create layers like this
inputSize = 10;
outputSize = 20;
myLayer = Dense(inputSize, outputSize, relu);
# You can get the weights and biases using
myLayer.W;
myLayer.b;
# You can call the layer to make a prediction
myLayer(myData);
# Equivalent to
relu.(myLayer.W * myData .+ myLayer.b);

# Create networks like this
model = Chain(
  Dense(10, 20, relu),
  Dense(20, 2)
);

# Calling the model will pass data through each layer.
model(myData);

# Get the parameters for the whole model with
p = params(model);

# Calculate the gradient with
gs = gradient(function()
  predicted = model(myData);
  loss = sum((predicted .- myLabels).^2);
  return loss;
end, params(model));
# Gradient of the layer 1 weights
gs[model[1].W];
# update! will update the weights and clear the gradient.
# Make sure to update all layers.
update!(model[1].W, -0.1 * gs[model[1].W]);
</syntaxhighlight>

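Putting the pieces above together, a gradient-descent training loop might look like the following. This is a sketch against the tracker-based Flux API used above (Flux 0.8 and earlier); <code>myData</code>, <code>myLabels</code>, the learning rate <code>0.1</code>, and the epoch count are placeholder values.

<syntaxhighlight lang="julia">
using Flux;
using Flux.Tracker: update!, gradient;

model = Chain(
  Dense(10, 20, relu),
  Dense(20, 2)
);

# Placeholder data: one 10-dimensional sample with a 2-dimensional label
myData = rand(10);
myLabels = rand(2);

loss() = sum((model(myData) .- myLabels).^2);

for epoch in 1:100
  gs = gradient(loss, params(model));
  # Update every parameter (weights and biases) of every layer,
  # not just model[1].W
  for p in params(model)
    update!(p, -0.1 * gs[p]);
  end
end
</syntaxhighlight>

Iterating over <code>params(model)</code> avoids having to call <code>update!</code> on each layer's <code>W</code> and <code>b</code> by hand.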

=Knet.jl=
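[https://github.com/denizyuret/Knet.jl Reference]<br>
Knet takes a lower-level approach than Flux: you write plain Julia functions and mark trainable arrays explicitly. The following is only an illustrative sketch, assuming Knet's AutoGrad-based API (<code>Param</code>, <code>@diff</code>, <code>grad</code>, <code>value</code>); the model and data here are made up for the example.

<syntaxhighlight lang="julia">
using Knet

# Wrap trainable arrays in Param so gradients are recorded for them
w = Param(randn(2, 10))
b = Param(zeros(2))

predict(x) = w * x .+ b
loss(x, y) = sum(abs2, predict(x) .- y)

# Placeholder sample and label
x = randn(10)
y = randn(2)

# @diff runs the loss and records a tape; grad reads a parameter's gradient off it
J = @diff loss(x, y)
dw = grad(J, w)
db = grad(J, b)

# Plain gradient-descent update on the underlying arrays
value(w) .-= 0.1 .* dw
value(b) .-= 0.1 .* db
</syntaxhighlight>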