case class BinaryCrossEntropyWithLogitsLoss(scope: Scope, input: Variable, target: STen, posWeights: Option[STen], reduction: Reduction) extends Op with Product with Serializable
input: shape (N, T), where the T >= 1 columns are multiple independent tasks.
target: same shape as the input, with float values in [0, 1].
posWeights: shape (T), one weight per task.
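A minimal construction sketch. This is hedged: `Scope.root`, `STen.rand`, `const`, and `Mean` are helpers assumed from the wider lamp API, and the shapes are illustrative.

```scala
import lamp._
import lamp.autograd._

// Hedged sketch; Scope.root, STen.rand, const and Mean are assumed
// from the wider lamp API, and the shapes below are illustrative.
Scope.root { implicit scope =>
  // logits: N = 2 samples, T = 3 independent tasks
  val logits: Variable = const(STen.rand(List(2L, 3L)))
  // target: same shape as the input, float values in [0, 1]
  val target: STen = STen.rand(List(2L, 3L))

  val loss: Variable = BinaryCrossEntropyWithLogitsLoss(
    scope = scope,
    input = logits,
    target = target,
    posWeights = None, // optionally an STen of shape (3), one weight per task
    reduction = Mean
  ).value // `value` holds the resulting loss Variable
}
```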
Linear Supertypes
- BinaryCrossEntropyWithLogitsLoss
- Serializable
- Product
- Equals
- Op
- AnyRef
- Any
Instance Constructors
- new BinaryCrossEntropyWithLogitsLoss(scope: Scope, input: Variable, target: STen, posWeights: Option[STen], reduction: Reduction)
Value Members
- val input: Variable
- val joinedBackward: Option[(STen) => Unit]
- Definition Classes
- Op
- val params: List[(Variable, (STen, STen) => Unit)]
Implementation of the backward pass.

A list of input variables paired up with an anonymous function computing the respective partial derivative. With the notation in the documentation of the trait lamp.autograd.Op: dy/dw2 => dy/dw2 * dw2/dw1. The first argument of the anonymous function is the incoming partial derivative (dy/dw2), the second argument is the output tensor into which the result (dy/dw2 * dw2/dw1) is accumulated (added). If the operation does not support computing the partial derivative for some of its arguments, do not include that argument in this list. (A toy sketch of this contract appears at the end of this page.)
- Definition Classes
- BinaryCrossEntropyWithLogitsLoss → Op
- See also
The documentation on the trait lamp.autograd.Op for more details and an example.
- val posWeights: Option[STen]
- def productElementNames: Iterator[String]
- Definition Classes
- Product
- val reduction: Reduction
- val scope: Scope
- val target: STen
- val value: Variable
The value of this operation.
- Definition Classes
- BinaryCrossEntropyWithLogitsLoss → Op
- val value1: Tensor
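To make the params contract above concrete, here is a hedged sketch of a hypothetical toy operation (not part of lamp) whose forward pass doubles its input. `zipBackward`, the `Variable(op, value)` companion apply, the in-place `+=` on STen, and `Scope.root` are assumptions based on lamp's autograd sources and may differ in detail.

```scala
import lamp._
import lamp.autograd._

// Hypothetical Op, for illustration only: value = input * 2.
case class TimesTwo(scope: Scope, input: Variable) extends Op {

  // One entry per differentiable argument. The closure receives the
  // incoming partial derivative dy/dw2 and must accumulate
  // dy/dw2 * dw2/dw1 into `out`. Here dw2/dw1 = 2.
  val params: List[(Variable, (STen, STen) => Unit)] = List(
    input.zipBackward { (incoming, out) =>
      // temporary scope for the intermediate product (assumed pattern)
      Scope.root { implicit tmp => out += (incoming * 2d) }
    }
  )

  // The forward result, wrapped so the autograd graph can reach this Op.
  val value: Variable = Variable(this, (input.value * 2d)(scope))(scope)
}
```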