• package root
  • package lamp

    Lamp provides utilities to build state-of-the-art machine learning applications.

    Notable types and packages:

    • lamp.STen is a memory managed wrapper around aten.ATen, an off-heap, native n-dimensional array backed by libtorch.
    • lamp.autograd implements reverse mode automatic differentiation.
    • lamp.nn contains neural network building blocks, see e.g. lamp.nn.Linear.
    • lamp.data implements a training loop and other data-related abstractions.
    • lamp.knn implements k-nearest neighbor search on the CPU and GPU.
    • lamp.umap.Umap implements the UMAP dimension reduction algorithm.
    • lamp.onnx implements serialization of computation graphs into ONNX format.
    • CSV and NPY file readers are also provided.
    How to get data into lamp

    Use one of the file readers or one of the factories in lamp.STen$.
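
    The factory route can be sketched as follows. This is an illustrative sketch, not verified against a build: it reuses the STen.eye and STen.ones factories that appear in the autograd example in this documentation, inside a memory managing Scope.

```scala
import lamp._

// Sketch, assuming the STen factories referenced elsewhere in these docs
// (STen.eye, STen.ones) and lamp's Scope-based memory management.
object GettingDataIn extends App {
  Scope.root { implicit scope =>
    // a 3x3 identity matrix and a 3x3 matrix of ones, double precision
    val identity = STen.eye(3, STenOptions.d)
    val ones = STen.ones(List(3, 3), STenOptions.d)
    // tensors support elementwise arithmetic; native memory is released
    // when the Scope closes
    val sum = identity + ones // diagonal elements are 2.0, the rest 1.0
  }
}
```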

    How to define a custom neural network layer

    See the documentation on lamp.nn.GenericModule

    How to compose neural network layers

    See the documentation on lamp.nn

    How to train models

    See the training loops in lamp.data.

  • package autograd

    Implements reverse mode automatic differentiation.

    The main types in this package are lamp.autograd.Variable and lamp.autograd.Op. The computational graph built by this package consists of vertices representing values (as lamp.autograd.Variable) and vertices representing operations (as lamp.autograd.Op).

    Variables contain the value of an Rn => Rm function. Variables may also contain the partial derivative of their argument with respect to a single scalar. A Variable whose value is a scalar (m = 1) can trigger the computation of partial derivatives of all the intermediate upstream Variables. Computing partial derivatives with respect to non-scalar Variables is not supported.

    A constant Variable may be created with the const or param factory methods in this package. const may be used for constants which do not need their partial derivatives computed. param, on the other hand, creates Variables which will fill in their partial derivatives. Further Variables may be created by the methods on lamp.autograd.Variable, eventually expressing more complex Rn => Rm functions.

    lamp.Scope.root { implicit scope =>
      // x is constant (depends on no other variables) and won't compute a partial derivative
      val x = lamp.autograd.const(STen.eye(3, STenOptions.d))
      // y is constant but will compute a partial derivative
      val y = lamp.autograd.param(STen.ones(List(3, 3), STenOptions.d))
      // z is a Variable with x and y dependencies
      val z = x + y
      // w is a Variable with z as a direct and x, y as transitive dependencies
      val w = z.sum
      // w is a scalar (number of elements is 1), thus we can call backprop() on it;
      // calling backprop fills out the partial derivatives of the upstream Variables
      w.backprop()
      // partialDerivative is empty since we created `x` with `const`
      assert(x.partialDerivative.isEmpty)
      // `y`'s partial derivative is defined and is computed: it holds `y`'s partial
      // derivative with respect to `w`, the scalar on which we called backprop()
      assert(y.partialDerivative.isDefined)
    }
    This package may be used to compute the derivative of any function, provided the function can be composed out of the provided methods. A particular use case is gradient based optimization.
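
    To illustrate what the reverse pass computes, here is a minimal scalar sketch in plain Scala. It is not lamp's implementation: all names (Value, add, mul) are invented for this example, and the naive recursive traversal assumes no shared intermediate nodes (a real implementation processes the graph in topological order).

```scala
// Minimal scalar reverse-mode autodiff sketch; names invented for illustration.
final class Value(val v: Double) {
  var grad: Double = 0.0          // accumulated partial derivative
  var back: () => Unit = () => () // propagates grad to the inputs
}

def add(a: Value, b: Value): Value = {
  val out = new Value(a.v + b.v)
  // d(a+b)/da = 1, d(a+b)/db = 1
  out.back = () => { a.grad += out.grad; b.grad += out.grad; a.back(); b.back() }
  out
}

def mul(a: Value, b: Value): Value = {
  val out = new Value(a.v * b.v)
  // d(a*b)/da = b, d(a*b)/db = a
  out.back = () => { a.grad += b.v * out.grad; b.grad += a.v * out.grad; a.back(); b.back() }
  out
}

// f(x, y) = x*y + x, so df/dx = y + 1 and df/dy = x
val x = new Value(3.0)
val y = new Value(4.0)
val f = add(mul(x, y), x)
f.grad = 1.0 // seed: df/df = 1
f.back()     // reverse pass fills x.grad and y.grad
println((x.grad, y.grad)) // x.grad = 5.0, y.grad = 3.0
```

    The reverse pass runs once from the single scalar output and yields the gradient with respect to every upstream leaf, which is what makes this mode suitable for gradient based optimization.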

    See also: lamp.autograd.Op for how to implement a new operation.

  • package data
  • package distributed
  • package extratrees
  • package knn
  • package nn

    Provides building blocks for neural networks

    Modules facilitating composition of other modules:

    • nn.Sequential composes a homogeneous list of modules (analogous to List)
    • nn.sequence composes a heterogeneous list of modules (analogous to tuples)
    • nn.EitherModule composes two modules in a scala.Either
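
    The List-versus-tuple analogy can be made concrete with plain function composition (this is not lamp API, just an illustration of homogeneous versus heterogeneous composition):

```scala
// Homogeneous composition, like nn.Sequential: every stage has the same
// type, so the stages fit in a List and fold with andThen.
val stages: List[Double => Double] = List(_ + 1.0, _ * 2.0)
val pipeline: Double => Double = stages.reduce(_ andThen _)

// Heterogeneous composition, like nn.sequence: stages may change the type
// (Double => Int, then Int => String), so they compose pairwise like a tuple.
val toInt: Double => Int = _.toInt
val toStr: Int => String = _.toString
val mixed: Double => String = toInt andThen toStr

println(pipeline(3.0)) // 8.0
println(mixed(3.7))    // 3
```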

  • package onnx
  • Converted
  • DefaultOpSet
  • DefaultOpSet1
  • NameMap
  • OpSet
  • Ops
  • VariableInfo
  • package saddle
  • package table
  • package umap
  • package util



package onnx

Linear Supertypes: AnyRef, Any
Type Members

  1. case class Converted(node: NodeProto, constants: Seq[TensorProto] = Nil) extends Product with Serializable
  2. trait DefaultOpSet1 extends OpSet
  3. trait NameMap extends AnyRef
  4. trait OpSet extends AnyRef
  5. case class VariableInfo(variable: Variable, name: String, input: Boolean, docString: String = "") extends Product with Serializable

Value Members

  1. def serialize(output: Variable, domain: String = "org.domain", modelDocString: String = "", opset: OpSet = DefaultOpSet)(infoFun: PartialFunction[Variable, VariableInfo]): ModelProto
  2. def serializeToFile(file: File, output: Variable, domain: String = "org.domain", modelDocString: String = "", opset: OpSet = DefaultOpSet)(infoFun: PartialFunction[Variable, VariableInfo]): Unit
  3. object DefaultOpSet extends DefaultOpSet1
  4. object Ops
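
A hedged usage sketch of serializeToFile, following only the parameter lists documented above; the variables `input` and `output`, the file name, and the domain string are placeholder assumptions, and this has not been verified against a build:

```scala
import java.io.File
import lamp.autograd.Variable
import lamp.onnx.{serializeToFile, VariableInfo}

// Sketch following the serializeToFile signature documented above.
// `output` is assumed to be the final Variable of an autograd graph and
// `input` one of its leaf Variables; both names are placeholders.
def export(output: Variable, input: Variable): Unit =
  serializeToFile(
    new File("model.onnx"),
    output,
    domain = "org.example",
    modelDocString = "exported from lamp"
  ) {
    // infoFun names the graph's inputs and outputs in the ONNX model
    case v if v eq input  => VariableInfo(v, name = "input", input = true)
    case v if v eq output => VariableInfo(v, name = "output", input = false)
  }
```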
