Lamp provides utilities to build state-of-the-art machine learning applications.
Notable types and packages:
- Use one of the file readers in lamp.io or one of the factories in lamp.STen$ (a short sketch of tensor creation follows this list).
- See the documentation on lamp.nn.GenericModule.
- See the documentation on the lamp.nn package.
- See the training loops in lamp.data.IOLoops.
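A minimal sketch of creating tensors with the factories in lamp.STen$, using the Scope based memory management that the examples below also rely on; the specific factories (STen.eye, STen.ones) and the STenOptions.d options are the ones appearing in the autograd example further down:

  import lamp._

  Scope.root { implicit scope =>
    // a 3x3 identity matrix allocated inside this scope
    val identity = STen.eye(3, STenOptions.d)
    // a 3x3 tensor filled with ones
    val ones = STen.ones(List(3, 3), STenOptions.d)
    // tensors allocated in the scope are released when the scope ends
  }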
The lamp.autograd package implements reverse mode automatic differentiation.
The main types in this package are lamp.autograd.Variable and lamp.autograd.Op. The computational graph built by this package consists of vertices representing values (as lamp.autograd.Variable) and vertices representing operations (as lamp.autograd.Op).
Variables contain the value of an R^n => R^m function. Variables may also contain the partial derivative of their argument with respect to a single scalar. A Variable whose value is a scalar (m=1) can trigger the computation of partial derivatives of all the intermediate upstream Variables. Computing partial derivatives with respect to non-scalar variables is not supported.
A constant Variable may be created with the const or param factory method in this package. const may be used for constants which do not need their partial derivatives to be computed. param, on the other hand, creates Variables which will fill in their partial derivatives. Further Variables may be created by the methods on lamp.autograd.Variable, eventually expressing more complex R^n => R^m functions.
  lamp.Scope.root { implicit scope =>
    // x is constant (depends on no other variables) and won't compute a partial derivative
    val x = lamp.autograd.const(STen.eye(3, STenOptions.d))
    // y is constant but will compute a partial derivative
    val y = lamp.autograd.param(STen.ones(List(3, 3), STenOptions.d))
    // z is a Variable with x and y dependencies
    val z = x + y
    // w is a Variable with z as a direct and x, y as transient dependencies
    val w = z.sum
    // w is a scalar (number of elements is 1), thus we can call backprop() on it;
    // calling backprop will fill out the partial derivatives of the upstream Variables
    w.backprop()
    // partialDerivative is empty since we created `x` with `const`
    assert(x.partialDerivative.isEmpty)
    // `y`'s partial derivative is defined and is computed;
    // it holds `y`'s partial derivative with respect to `w`, the scalar on which we called backprop()
    assert(y.partialDerivative.isDefined)
  }
This package may be used to compute the derivative of any function, provided the function can be composed out of the provided methods. A particular use case is gradient-based optimization.
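As an illustration of composing the provided operations, a minimal sketch using only the calls from the example above (const, param, +, sum); since w = sum(y + y + x), the partial derivative of w with respect to each element of y is 2:

  lamp.Scope.root { implicit scope =>
    val x = lamp.autograd.const(STen.eye(3, STenOptions.d))
    val y = lamp.autograd.param(STen.ones(List(3, 3), STenOptions.d))
    // w = sum(y + y + x) is a scalar, so backprop() may be called on it
    val w = (y + y + x).sum
    w.backprop()
    // the partial derivative with respect to y is a 3x3 tensor filled with 2s
    assert(y.partialDerivative.isDefined)
  }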
See also:
- https://arxiv.org/pdf/1811.05031.pdf for a review of the algorithm
- lamp.autograd.Op for how to implement a new operation
The lamp.nn package provides building blocks for neural networks.
Notable types:
Optimizers:
Modules that facilitate composing other modules:
Examples of neural network building blocks, layers etc.:
- lamp.nn.Linear implements W X + b with parameters W and b and input X (see the sketch below).
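A minimal sketch of that affine computation written directly with lamp.autograd, to show what such a building block computes; the matrix product method name mm is an assumption here (it does not appear on this page), check lamp.autograd.Variable for the exact operator set:

  lamp.Scope.root { implicit scope =>
    // W and b are trainable parameters, X is a constant input
    val W = lamp.autograd.param(STen.ones(List(2, 3), STenOptions.d))
    val b = lamp.autograd.param(STen.ones(List(2, 1), STenOptions.d))
    val X = lamp.autograd.const(STen.ones(List(3, 1), STenOptions.d))
    // W X + b; mm (matrix product) is assumed, + and sum are shown in the autograd example above
    val out = W.mm(X) + b
    // reduce to a scalar to backprop and obtain the gradients of W and b
    out.sum.backprop()
    assert(W.partialDerivative.isDefined && b.partialDerivative.isDefined)
  }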