package lamp

Lamp provides utilities to build state-of-the-art machine learning applications.

Overview

Notable types and packages:

  • lamp.STen is a memory-managed wrapper around aten.ATen, an off-heap, native n-dimensional array backed by libtorch.
  • lamp.autograd implements reverse mode automatic differentiation.
  • lamp.nn contains neural network building blocks, see e.g. lamp.nn.Linear.
  • lamp.data.IOLoops implements a training loop and other data-related abstractions.
  • lamp.knn implements k-nearest neighbor search on the CPU and GPU.
  • lamp.umap.Umap implements the UMAP dimension reduction algorithm.
  • lamp.onnx implements serialization of computation graphs into ONNX format.
  • lamp.io contains CSV and NPY readers.

How to get data into lamp

Use one of the file readers in lamp.io or one of the factories in lamp.STen$.
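
A minimal sketch of copying data from JVM memory into a tensor. STen.ones and STenOptions.d are documented on this page; the fromDoubleArray factory and its parameter list are assumptions, so verify them against the factories in lamp.STen$.

    Scope.root { implicit scope =>
      // Factory documented on this page: a 3 x 3 tensor of ones in double precision
      val ones = STen.ones(List(3, 3), STenOptions.d)

      // Assumed factory copying a JVM Array[Double] into a 2 x 3 CPU tensor in
      // double precision; check lamp.STen$ for the exact name and signature
      val fromJvm = STen.fromDoubleArray(
        Array(1d, 2d, 3d, 4d, 5d, 6d),
        List(2, 3),
        CPU,
        DoublePrecision
      )
    }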

How to define a custom neural network layer

See the documentation on lamp.nn.GenericModule.
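
A minimal sketch of what such a layer might look like, assuming GenericModule exposes a forward method and a state of tagged parameters; the type parameters, the LeafTag type and the member signatures are assumptions, so consult lamp.nn.GenericModule for the actual contract.

    import lamp._
    import lamp.autograd.{Constant, Variable}
    import lamp.nn._

    // Hypothetical tags identifying the trainable parameters
    case object MyAffineWeights extends LeafTag
    case object MyAffineBias extends LeafTag

    // Hypothetical affine layer computing x * W + b; the `forward` and `state`
    // signatures are assumptions, not copied from lamp's sources
    case class MyAffine(weights: Constant, bias: Constant)
        extends GenericModule[Variable, Variable] {
      def state = List(weights -> MyAffineWeights, bias -> MyAffineBias)
      def forward[S: Sc](x: Variable): Variable = x.mm(weights) + bias
    }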

How to compose neural network layers

See the documentation on lamp.nn.

How to train models

See the training loops in lamp.data.IOLoops.

Linear Supertypes
AnyRef, Any

Package Members

  1. package autograd

    Implements reverse mode automatic differentiation.

    The main types in this package are lamp.autograd.Variable and lamp.autograd.Op. The computational graph built by this package consists of vertices representing values (as lamp.autograd.Variable) and vertices representing operations (as lamp.autograd.Op).

    Variables contain the value of an Rn => Rm function. Variables may also hold the partial derivative of a downstream scalar with respect to themselves. A Variable whose value is a scalar (m = 1) can trigger the computation of partial derivatives of all the intermediate upstream Variables. Computing partial derivatives with respect to non-scalar variables is not supported.

    A constant Variable may be created with the const or param factory methods in this package. const is for constants which do not need their partial derivatives computed; param, on the other hand, creates Variables which will fill in their partial derivatives. Further Variables may be created by the methods on Variable, eventually expressing more complex Rn => Rm functions.

    Example
    lamp.Scope.root { implicit scope =>
      // x is a constant (depends on no other variables) and won't compute a partial derivative
      val x = lamp.autograd.const(STen.eye(3, STenOptions.d))
      // y is also a leaf, but created with `param` so it will compute a partial derivative
      val y = lamp.autograd.param(STen.ones(List(3, 3), STenOptions.d))

      // z is a Variable with x and y as dependencies
      val z = x + y

      // w is a Variable with z as a direct and x, y as transitive dependencies
      val w = z.sum
      // w is a scalar (its number of elements is 1), thus we can call backprop() on it.
      // Calling backprop fills in the partial derivatives of the upstream variables.
      w.backprop()

      // partialDerivative is empty since we created `x` with `const`
      assert(x.partialDerivative.isEmpty)

      // `y`'s partial derivative is defined and is computed:
      // it holds `y`'s partial derivative with respect to `w`, the scalar on which we called backprop()
      assert(y.partialDerivative.isDefined)
    }

    This package may be used to compute the derivative of any function, provided the function can be composed out of the provided methods. A particular use case is gradient-based optimization.

    See also

    https://arxiv.org/pdf/1811.05031.pdf for a review of the algorithm

    lamp.autograd.Op for how to implement a new operation

  2. package data
  3. package distributed
  4. package extratrees
  5. package knn
  6. package nn

    Provides building blocks for neural networks.

    Notable types include optimizers and modules facilitating the composition of other modules:

    • nn.Sequential composes a homogeneous list of modules (analogous to List)
    • nn.sequence composes a heterogeneous list of modules (analogous to tuples)
    • nn.EitherModule composes two modules in a scala.Either

    The package also contains neural network building blocks and layers; see the composition sketch below and the package members for examples.
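
    A rough illustration of composing layers. sequence itself is documented above; the Linear factory parameters and the Fun wrapper are assumptions not documented on this page, so treat them as placeholders for whatever the package provides.

    import lamp._
    import lamp.nn._

    Scope.root { implicit scope =>
      // A hypothetical two-layer perceptron composed with `sequence`;
      // check the lamp.nn package members for the real factories
      val mlp = sequence(
        Linear(in = 4, out = 16, tOpt = STenOptions.d),
        Fun(implicit scope => _.relu),
        Linear(in = 16, out = 3, tOpt = STenOptions.d)
      )
    }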

  7. package onnx
  8. package saddle
  9. package table
  10. package umap
  11. package util

Type Members

  1. case class BufferPair(source: STen, destination: STen) extends Product with Serializable
  2. case class ColumnSelection(e: Either[String, Int]) extends Product with Serializable
  3. case class CudaDevice(i: Int) extends Device with Product with Serializable
  4. sealed trait Device extends AnyRef

    Represents a device where tensors are stored and tensor operations are executed

  5. class EmptyMovable[-R] extends AnyRef
  6. sealed trait FloatingPointPrecision extends AnyRef
  7. trait Movable[-R] extends AnyRef
  8. case class NcclUniqueId(base64: String) extends Product with Serializable
  9. case class STen extends Product with Serializable

    Memory managed, off-heap, GPU and CPU compatible N-dimensional array.

    This class is a wrapper around aten.Tensor providing a more convenient API. All allocating operations require an implicit lamp.Scope.

    STen instances are associated with a device which determines where the memory is allocated, and where the operations are performed. Operations on multiple tensors expect that all the arguments reside on the same device.

    lamp.STen.options returns a lamp.STenOptions which describes the device, data type and storage layout of a tensor. Most factory methods in the companion object in turn require a lamp.STenOptions to specify the device, data type and storage layout.

    The naming convention of most operations follows libtorch. Operations return their result in a copy, i.e. not in place; these operations need a lamp.Scope. Operations whose name ends with an underscore are in place. Operations whose name contains out write their results into the specified output tensor; these live in the companion object. Some operations are exempt from this naming rule, e.g. +=, -=, *= etc.
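
    A minimal example of the copying vs. in-place distinction, using only factories and operators mentioned on this page:

    Scope.root { implicit scope =>
      val a = STen.ones(List(2, 2), STenOptions.d)
      val b = STen.eye(2, STenOptions.d)
      // allocating: returns a new tensor registered in the implicit Scope
      val c = a + b
      // in place: mutates the memory of `a`, allocating no new tensor
      a += b
    }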

    Semantics of operations follow those of the libtorch operations with the same name. Many of the operations broadcast. See https://numpy.org/doc/stable/user/basics.broadcasting.html#general-broadcasting-rules for the broadcasting rules. In short:

    1. Shapes are aligned from the right, extending with ones to the left as needed.
    2. If two aligned dimensions do not match but one of them is 1, then it is expanded to the size of the other dimension, as if its values were copied along it. If two aligned dimensions do not match and neither of them is 1, then the operation fails.

    Examples

    Scope.root { implicit scope =>
      val sum = Scope { implicit scope =>
        val ident = STen.eye(3, STenOptions.d)
        val ones = STen.ones(List(3, 3), STenOptions.d)
        ident + ones
      }
      assert(sum.toMat == mat.ones(3, 3) + mat.ident(3))
    }
    Broadcasting examples

    // successful
    3 x 4 x 6 A
        4 x 6 B
    3 x 4 x 6 Result // B is repeated 3 times along the first dimension

    // successful
    3 x 4 x 6 A
    3 x 1 x 6 B
    3 x 4 x 6 Result // B's second dimension is repeated 4 times

    // fails
    3 x 4 x 6 A
    3 x 2 x 6 B
    3 x 4 x 6 Result // 2 != 4
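
    The first broadcasting example above, expressed in code; a minimal sketch using only factories documented on this page:

    Scope.root { implicit scope =>
      val a = STen.ones(List(3, 4, 6), STenOptions.d)
      val b = STen.ones(List(4, 6), STenOptions.d)
      // b's shape is aligned from the right and repeated along the leading
      // dimension, so the sum has shape 3 x 4 x 6
      val c = a + b
    }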

    The companion object contains various factories which copy data from the JVM memory to STen tensors.

  10. case class STenOptions(value: TensorOptions) extends Product with Serializable
  11. type Sc[_] = Scope
  12. final class Scope extends AnyRef

    Facilitates memory management of off-heap data structures.

    Tracks allocations of aten.Tensor and aten.TensorOption instances.

    aten.Tensor and aten.TensorOption instances are not freed up by the garbage collector. Lamp implements zoned memory management around these objects. The managed counterpart of aten.Tensor is lamp.STen, while for aten.TensorOption it is lamp.STenOptions.

    One can only create a lamp.STen instance with a lamp.Scope in implicit scope.

    Create new scopes with lamp.Scope.root or lamp.Scope.apply.

    Examples

    // Scope.root returns Unit
    Scope.root { implicit scope =>
      val sum = Scope { implicit scope =>
        // Intermediate values allocated in this block (`ident` and `ones`) are
        // freed when this block returns.
        // The return value (`ident + ones`) of this block is moved to the outer scope.
        val ident = STen.eye(3, STenOptions.d)
        val ones = STen.ones(List(3, 3), STenOptions.d)
        ident + ones
      }
      assert(sum.toMat == mat.ones(3, 3) + mat.ident(3))
      // `sum` is freed once this block exits
    }

Value Members

  1. def scope(implicit s: Scope): Scope
  2. def scoped(r: Tensor)(implicit s: Scope): Tensor
  3. object BufferPair extends Serializable
  4. case object CPU extends Device with Product with Serializable
  5. object Device
  6. case object DoublePrecision extends FloatingPointPrecision with Product with Serializable
  7. object EmptyMovable
  8. case object HalfPrecision extends FloatingPointPrecision with Product with Serializable
  9. case object MPS extends Device with Product with Serializable
  10. object Movable
  11. object NcclUniqueId extends Serializable
  12. object STen extends Serializable

    Companion object of lamp.STen

  13. object STenOptions extends Serializable
  14. object Scope
  15. case object SinglePrecision extends FloatingPointPrecision with Product with Serializable
  16. object TensorHelpers
