lamp.autograd

LayerNormOp

case class LayerNormOp(scope: Scope, input: Variable, weight: Variable, bias: Variable, normalizedShape: List[Long], eps: Double) extends Op with Product with Serializable
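
For reference: assuming this op wraps the standard fused layer-normalization primitive (the saved mean and rstd members are consistent with that), the forward computation over the trailing dimensions given by normalizedShape is

  y = (x - E[x]) / sqrt(Var[x] + eps) * weight + bias

where rstd holds the saved reciprocal standard deviation 1 / sqrt(Var[x] + eps), reused together with mean by the backward pass.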

Linear Supertypes
Serializable, Product, Equals, Op, AnyRef, Any

Instance Constructors

  1. new LayerNormOp(scope: Scope, input: Variable, weight: Variable, bias: Variable, normalizedShape: List[Long], eps: Double)
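
A minimal usage sketch, not taken from the lamp sources: the entry points Scope.root, STen.rand/ones/zeros, STenOptions.d, param, sum, backprop and partialDerivative are assumed to match the current lamp API and may differ across versions. In practice this op is usually reached through higher-level helpers on Variable; it is constructed directly here only to illustrate the Op interface.

  import lamp._
  import lamp.autograd.{LayerNormOp, Variable, param}

  object LayerNormOpExample extends App {
    Scope.root { implicit scope =>
      // Normalize a batch of 2 rows over the last dimension (size 3).
      // STen factory signatures are assumed; verify against your lamp version.
      val input  = param(STen.rand(List(2L, 3L), STenOptions.d))
      val weight = param(STen.ones(List(3L), STenOptions.d))  // gamma
      val bias   = param(STen.zeros(List(3L), STenOptions.d)) // beta

      // `value` is the Variable holding the op's output.
      val out: Variable =
        new LayerNormOp(scope, input, weight, bias, List(3L), 1e-5).value

      // Reduce to a scalar and run the backward pass; gradients are
      // accumulated into the participating Variables.
      val loss = out.sum
      loss.backprop()

      println(input.partialDerivative) // Option[STen] with d(loss)/d(input)
    }
  }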

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. val bias: Variable
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
  7. val eps: Double
  8. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  9. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @IntrinsicCandidate() @native()
  10. val input: Variable
  11. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  12. val joinedBackward: Option[(STen) => Unit]
    Definition Classes
    Op
  13. val mean: Tensor
  14. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  15. val normalizedShape: List[Long]
  16. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  17. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  18. val output: Tensor
  19. val params: List[(Variable, (STen, STen) => Unit)]

Implementation of the backward pass

    A list of input variables, each paired with an anonymous function computing the respective partial derivative. In the notation of the documentation of the trait lamp.autograd.Op: given the incoming partial derivative dy/dw2, each function computes dy/dw2 * dw2/dw1. The first argument of the anonymous function is the incoming partial derivative (dy/dw2); the second argument is the output tensor into which the result (dy/dw2 * dw2/dw1) is accumulated (added). A minimal sketch of this contract follows the member list below.

    If the operation does not support computing the partial derivative for some of its arguments, then do not include that argument in this list.

    Definition Classes
    LayerNormOp → Op
    See also

    The documentation of the trait lamp.autograd.Op for more details and an example.

  20. def productElementNames: Iterator[String]
    Definition Classes
    Product
  21. val rstd: Tensor
  22. val scope: Scope
  23. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  24. val value: Variable

The value of this operation

    Definition Classes
    LayerNormOp → Op
  25. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  26. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  27. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  28. val weight: Variable
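
To make the params contract above concrete, here is a self-contained sketch. It deliberately does not use lamp types: the Tensor class below is a hypothetical one-element stand-in for STen, and all names are made up for this illustration. It shows how a backward closure of type (incoming, accumulator) => Unit receives the incoming partial derivative and adds its contribution into a gradient buffer.

  // Sketch of the (STen, STen) => Unit contract used by `params`.
  object ParamsContractSketch extends App {
    // Hypothetical mutable stand-in for STen.
    final class Tensor(var v: Double) {
      def addInPlace(x: Double): Unit = v += x
    }

    // For y = 2 * x we have dy/dx = 2, so the closure adds incoming * 2
    // into the accumulator (the "accumulated (added)" semantics above).
    val backwardOfScaleByTwo: (Tensor, Tensor) => Unit =
      (incoming, out) => out.addInPlace(incoming.v * 2)

    val incoming = new Tensor(1.0) // dy/dy = 1 at the output
    val gradX    = new Tensor(0.0) // gradient accumulator for x
    backwardOfScaleByTwo(incoming, gradX)
    println(gradX.v) // 2.0 == dy/dx
  }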

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable]) @Deprecated
    Deprecated

    (Since version 9)

Inherited from Serializable

Inherited from Product

Inherited from Equals

Inherited from Op

Inherited from AnyRef

Inherited from Any
