Packages

  • package root
    Definition Classes
    root
  • package lamp

    Lamp provides utilities to build state-of-the-art machine learning applications

    Overview

    Notable types and packages:

    • lamp.STen is a memory managed wrapper around aten.ATen, an off-heap, native n-dimensional array backed by libtorch.
    • lamp.autograd implements reverse mode automatic differentiation.
    • lamp.nn contains neural network building blocks, see e.g. lamp.nn.Linear.
    • lamp.data.IOLoops implements a training loop and other data related abstractions.
    • lamp.knn implements k-nearest neighbor search on the CPU and GPU.
    • lamp.umap.Umap implements the UMAP dimension reduction algorithm.
    • lamp.onnx implements serialization of computation graphs into ONNX format.
    • lamp.io contains CSV and NPY readers.

    How to get data into lamp

    Use one of the file readers in lamp.io or one of the factories in lamp.STen$.
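For instance, a minimal sketch built only from factories that appear elsewhere on this page (the import is an assumption about where these types live):

    import lamp.{Scope, STen, STenOptions}

    Scope.root { implicit scope =>
      // identity matrix in double precision (STenOptions.d)
      val id = STen.eye(3, STenOptions.d)
      // 3 x 3 tensor filled with ones
      val ones = STen.ones(List(3, 3), STenOptions.d)
      // tensors allocated in this Scope are released when the Scope closes
    }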

    How to define a custom neural network layer

    See the documentation on lamp.nn.GenericModule.

    How to compose neural network layers

    See the documentation on lamp.nn.

    How to train models

    See the training loops in lamp.data.IOLoops.

    Definition Classes
    root
  • package autograd

    Implements reverse mode automatic differentiation

    The main types in this package are lamp.autograd.Variable and lamp.autograd.Op. The computational graph built by this package consists of vertices representing values (as lamp.autograd.Variable) and vertices representing operations (as lamp.autograd.Op).

    Variables contain the value of an Rn => Rm function. Variables may also hold a partial derivative: the derivative of a single downstream scalar (the one on which backprop() is invoked) with respect to the Variable itself. A Variable whose value is a scalar (m=1) can trigger the computation of the partial derivatives of all the intermediate upstream Variables. Computing partial derivatives with respect to non-scalar variables is not supported.

    A constant Variable may be created with the const or param factory methods in this package. const may be used for constants which do not need their partial derivatives computed. param, on the other hand, creates Variables which will fill in their partial derivatives. Further Variables may be created by the methods on Variable, eventually expressing more complex Rn => Rm functions.

    Example
    import lamp.{STen, STenOptions}

    lamp.Scope.root{ implicit scope =>
      // x is constant (depends on no other variables) and won't compute a partial derivative
      val x = lamp.autograd.const(STen.eye(3, STenOptions.d))
      // y is constant but will compute a partial derivative
      val y = lamp.autograd.param(STen.ones(List(3,3), STenOptions.d))
    
      // z is a Variable with x and y dependencies
      val z = x+y
    
      // w is a Variable with z as a direct and x, y as transitive dependencies
      val w = z.sum
      // w is a scalar (number of elements is 1), thus we can call backprop() on it.
      // calling backprop will fill out the partial derivatives of the upstream variables
      w.backprop()
    
      // partialDerivative is empty since we created `x` with `const`
      assert(x.partialDerivative.isEmpty)
    
      // `y`'s partial derivative is defined and is computed
      // it holds `y`'s partial derivative with respect to `w`, the scalar which we called backprop() on
      assert(y.partialDerivative.isDefined)
    
    }

    This package may be used to compute the derivative of any function, provided the function can be composed from the provided operations. A particular use case is gradient based optimization.
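
    For instance, a minimal sketch of a single gradient computation, as used in gradient based optimization (built only from operations shown above; the imports are assumptions about where these names live):

    import lamp.{Scope, STen, STenOptions}
    import lamp.autograd.param

    Scope.root { implicit scope =>
      val w = param(STen.ones(List(3, 3), STenOptions.d))
      // a toy scalar valued function of w, composed from + and sum
      val loss = (w + w).sum
      loss.backprop()
      // the partial derivative of loss with respect to w; a gradient descent
      // step would subtract a multiple of this tensor from w's value
      val grad = w.partialDerivative.get
    }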

    Definition Classes
    lamp
    See also

    https://arxiv.org/pdf/1811.05031.pdf for a review of the algorithm

    lamp.autograd.Op for how to implement a new operation

  • package data
    Definition Classes
    lamp
  • package bytesegmentencoding

    Greedy contraction of consecutive n-grams

  • package distributed
  • package schemas
  • BatchStream
  • BufferedImageHelper
  • Codec
  • CodecFactory
  • DataParallel
  • EmptyBatch
  • EndStream
  • GraphBatchStream
  • IOLoops
  • IdentityCodec
  • IdentityCodecFactory
  • LoopState
  • NonEmptyBatch
  • Peek
  • Reader
  • SWA
  • SWALoopState
  • SimpleLoopState
  • SimpleThenSWALoopState
  • StateIO
  • StreamControl
  • TensorLogger
  • Text
  • TrainingCallback
  • ValidationCallback
  • Writer
  • package distributed
    Definition Classes
    lamp
  • package extratrees
    Definition Classes
    lamp
  • package knn
    Definition Classes
    lamp
  • package nn

    Provides building blocks for neural networks

    Notable types:

    Optimizers: gradient based optimizer implementations provided in this package.

    Modules facilitating composing other modules (see the sketch after this list):

    • nn.Sequential composes a homogeneous list of modules (analogous to List)
    • nn.sequence composes a heterogeneous list of modules (analogous to tuples)
    • nn.EitherModule composes two modules in a scala.Either
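
    A hypothetical sketch of composing layers with nn.sequence (the Linear factory arguments and the Fun wrapper are assumptions made for illustration; consult lamp.nn for the actual signatures):

    import lamp.{Scope, STenOptions}
    import lamp.nn._

    Scope.root { implicit scope =>
      // a two layer perceptron: sequence composes the heterogeneous
      // list of modules into a single module
      val mlp = sequence(
        Linear(in = 784, out = 64, tOpt = STenOptions.d), // assumed factory
        Fun(implicit scope => v => v.relu),               // assumed wrapper
        Linear(in = 64, out = 10, tOpt = STenOptions.d)   // assumed factory
      )
    }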

    Examples of neural network building blocks and layers: see e.g. lamp.nn.Linear.

    Definition Classes
    lamp
  • package onnx
    Definition Classes
    lamp
  • package saddle
    Definition Classes
    lamp
  • package table
    Definition Classes
    lamp
  • package umap
    Definition Classes
    lamp
  • package util
    Definition Classes
    lamp
package data

Package Members

  1. package bytesegmentencoding

    Greedy contraction of consecutive n-grams

  2. package distributed
  3. package schemas

Type Members

  1. trait BatchStream[+I, S, C] extends AnyRef

    A functional stateful stream of items

    lamp's training loops work from data presented in BatchStreams.

    An instance of BatchStream is a description of the data stream; it does not by itself allocate or store any data. The stream needs to be driven by an interpreter. lamp.data.IOLoops and the companion object BatchStream contain such interpreters.

    See the abstract members and the companion object for more documentation.

    I

    the item type; the stream will yield items of this type

    S

    the state type; the stream will carry over and accumulate state of this type

    C

    the type of accessory resources (e.g. buffers); the stream might need an instance of this type for its operation. The intended use is for fixed, pre-allocated pinned buffer pairs which facilitate host-device copies. See lamp.Device.toBatched and lamp.BufferPair.
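
    As an illustration, a hedged sketch of obtaining a BatchStream from in-memory tensors (the factory name minibatchesFromFull and its parameters are assumptions; see the companion object for the actual factories):

    import lamp.{Scope, STen, STenOptions, CPU}
    import lamp.data.BatchStream

    Scope.root { implicit scope =>
      val features = STen.ones(List(100, 3), STenOptions.d)
      val target = STen.ones(List(100), STenOptions.d)
      // assumed factory: yields minibatches of 16 rows moved to the device
      val stream = BatchStream.minibatchesFromFull(
        minibatchSize = 16,
        dropLast = true,
        features = features,
        target = target,
        device = CPU
      )
    }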

  2. trait Codec extends AnyRef

    An abstraction around byte to token encodings.

  3. trait CodecFactory[T <: Codec] extends AnyRef

    An abstraction around byte to token encodings.

  4. sealed trait LoopState extends AnyRef
  5. case class NonEmptyBatch[I](batch: I) extends StreamControl[I] with Product with Serializable
  6. case class Peek(label: String) extends Module with Product with Serializable
  7. case class SWALoopState(model: Seq[STen], optimizer: Seq[STen], epoch: Int, lastValidationLoss: Option[Double], minValidationLoss: Option[Double], numberOfAveragedModels: Int, averagedModels: Option[Seq[Tensor]], learningCurve: List[(Int, Double, Option[Double])]) extends LoopState with Product with Serializable
  8. case class SimpleLoopState(model: Seq[STen], optimizer: Seq[STen], epoch: Int, lastValidationLoss: Option[Double], minValidationLoss: Option[Double], minValidationLossModel: Option[(Int, Seq[Tensor])], learningCurve: List[(Int, Double, Option[(Double, Double)])]) extends LoopState with Product with Serializable
  9. case class SimpleThenSWALoopState(simple: SimpleLoopState, swa: Option[SWALoopState]) extends LoopState with Product with Serializable
  10. sealed trait StreamControl[+I] extends AnyRef
  11. case class TensorLogger(stop: () => Unit) extends Product with Serializable

    Class holding a lambda to stop the logging. See its companion object and lamp.data.TensorLogger#start.

  12. trait TrainingCallback extends AnyRef
  13. trait ValidationCallback extends AnyRef

Value Members

  1. object BatchStream
  2. object BufferedImageHelper
  3. object DataParallel
  4. case object EmptyBatch extends StreamControl[Nothing] with Product with Serializable
  5. case object EndStream extends StreamControl[Nothing] with Product with Serializable
  6. object GraphBatchStream
  7. object IOLoops

    Contains training loops and helpers around them

    The two training loops implemented here are:

    • a simple loop which trains the model and periodically evaluates it on a validation batch stream (see SimpleLoopState)
    • a variant which runs the simple loop, then continues with stochastic weight averaging (see SWA and SimpleThenSWALoopState)

  8. object IdentityCodec extends Codec
  9. object IdentityCodecFactory extends CodecFactory[IdentityCodec.type]
  10. object Reader
  11. object SWA
  12. object StateIO

    Helpers to read and write training loop state

  13. object StreamControl
  14. object TensorLogger extends Serializable

    Utility to periodically log active tensors. See lamp.data.TensorLogger#start.

  15. object Text
  16. object TrainingCallback
  17. object ValidationCallback
  18. object Writer

    Serializes tensors

    This format is similar to the ONNX external tensor serialization format, but it uses JSON rather than protobuf.

    Format specification

    Sequences of tensors are serialized into a JSON descriptor and a data blob. The schema of the descriptor is the case class lamp.data.schemas.TensorList. The location field in this schema holds a path to the data blob. If this is a relative POSIX path then it is relative to the file path where the descriptor itself is written. Otherwise it is an absolute path of the data blob file.

    The descriptor may be embedded into larger JSON structures.

    The data blob itself is the raw data in little endian byte order. Floating point is IEEE-754. The descriptor specifies the byte offset and byte length of the tensors inside the data blob. As such, the data blob contains no framing or other control bytes, but it may contain padding bytes between tensors.
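
    As an illustration, a hedged sketch of writing and reading back a sequence of tensors in this format (the method names writeTensorsIntoFile and readTensorsFromFile and their signatures are assumptions; consult Writer and Reader for the actual API):

    import java.io.File
    import lamp.{Scope, STen, STenOptions, CPU}
    import lamp.data.{Reader, Writer}

    Scope.root { implicit scope =>
      val t = STen.ones(List(2, 2), STenOptions.d)
      // assumed call: writes the JSON descriptor and the data blob
      Writer.writeTensorsIntoFile(List(t), new File("tensors.json"))
      // assumed counterpart: reads the descriptor and loads the blob
      val readBack = Reader.readTensorsFromFile(new File("tensors.json"), CPU, pin = false)
    }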
