Autograd experiments in Clojure.
These are some experiments to bring a PyTorch-like autograd library to Clojure. This is not primarily a neural network library, but rather a means of abstraction for calculating gradients for scientific computing. clj-autograd is currently built with denisovan, a core.matrix backend for neanderthal, because we want linear algebra to be as fast as possible and to build competitive deep learning and Bayesian inference on top. We hope to keep the general autograd machinery portable across core.matrix backends, but will explore optionally inlined neanderthal operations where they improve performance.
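Selecting the neanderthal-backed implementation for core.matrix might look like the following minimal sketch; the denisovan namespace and the implementation keyword are assumptions here, so check the denisovan README for the exact setup:

    ;; setup sketch; the denisovan namespace and the :neanderthal key are assumptions
    (require '[clojure.core.matrix :as m]
             '[denisovan.core])                  ; loads/registers the neanderthal backend
    (m/set-current-implementation :neanderthal)  ; use it for m/matrix etc.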
Feel free to drop into Slack or Gitter and ask me questions.
Do not consider the API stable yet; there will be quite a few changes towards a terser math notation.
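The core abstraction is a tape: values are wrapped with tape (a truthy second argument marks the tape as trainable), operations build up the computation graph, and the :backward closure on an output propagates gradients back to the tracked tapes. Here is a minimal sketch reusing only operations from the logistic regression example below; the required namespaces and the internal shape of the grads value are assumptions:

    ;; minimal sketch of the tape API, using only operations from the example below
    ;; assuming requires along the lines of (namespace names are assumptions):
    ;; (require '[clj-autograd.core :refer :all]
    ;;          '[clojure.core.matrix :as m])
    (let [x     (tape (m/matrix [1.0 -1.0]) true)  ; leaf tracked for gradients
          y     (tape (m/matrix [1 0]))            ; untracked binary targets
          out   (bcrossent (sigmoid x) y)          ; scalar loss
          grads ((:backward out) out 1)]           ; backprop with seed gradient 1
      (gd! grads))                                 ; one gradient-descent step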
To build a logistic regression model and do gradient descent with automatic gradients, you can do something like this:
(let [X (tape (m/matrix [[5 2] [-1 0] [5 2]]))  ; inputs, one row per sample
      Y (tape (m/matrix [1 0 1]))               ; binary targets
      c (tape 0 true)                           ; bias, tracked for gradients
      b (tape (m/matrix [0 0]) true)]           ; weights, tracked for gradients
  (loop [i 1000]
    (when (pos? i)
      (let [Y*    (sigmoid (add (mul X b) (broadcast-like c Y)))  ; predictions
            out   (bcrossent Y* Y)                                ; scalar loss
            grads ((:backward out) out 1)]                        ; backprop, seed 1
        (when (zero? (mod i 100))
          (prn "Loss:" @(:data out) ", b:" @(:data b) ", c:" @(:data c)))
        (gd! grads)                             ; one gradient-descent step
        (recur (dec i)))))
  ;; test assertion: the fitted model separates the samples
  (is (= (seq @(:data (sigmoid (add (mul X b) (broadcast-like c Y)))))
         '(0.9998655456159079 0.04813488562019285 0.9998655456159079))))
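For reference, this example optimizes the standard logistic regression objective: predictions come from the logistic sigmoid, and bcrossent computes the binary cross-entropy loss (whether it averages or sums over the N samples is an assumption here):

    \hat{Y} = \sigma(X b + c), \qquad \sigma(z) = \frac{1}{1 + e^{-z}}

    L(\hat{Y}, Y) = -\frac{1}{N} \sum_{i=1}^{N} \big[ Y_i \log \hat{Y}_i + (1 - Y_i) \log(1 - \hat{Y}_i) \big]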
Also have a look at the tests for more examples.
Copyright 2017 Christian Weilbach
Distributed under the Eclipse Public License either version 1.0 or (at your option) any later version.