com.thoughtworks.deeplearning

object DifferentiableFloat

A namespace of common operators for Float layers.

Author:

杨博 (Yang Bo) <[email protected]>
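
A minimal usage sketch of this namespace (hedged: it composes only operators documented on this page, and assumes Symbolic and Poly from the same library are in scope):

Example:
  1. import com.thoughtworks.deeplearning.DifferentiableFloat._
    import com.thoughtworks.deeplearning.Symbolic
    // a tiny differentiable expression over a symbolic Float input,
    // combining the exp, abs and + Cases documented below
    def tinyNetwork(implicit x: Float @Symbolic) = {
      Poly.MathMethods.+(Poly.MathFunctions.exp(x), Poly.MathFunctions.abs(x))
    }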

Linear Supertypes

AnyRef, Any

Type Members

  1. final class FloatLayerOps[Input <: Tape] extends AnyRef

  2. implicit final class NativeFloatOps extends AnyRef

  3. trait OptimizerFactory extends AnyRef


Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. implicit def Float*Float[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers.

    The returned Case is used by the polymorphic function *, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        inputFloatLayer * anotherFloatLayer
      }
  5. implicit def Float+Float[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers.

    The returned Case is used by the polymorphic function +, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods.+(inputFloatLayer, anotherFloatLayer)
      }
  6. implicit def Float-Float[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers.

    The returned Case is used by the polymorphic function -, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods.-(inputFloatLayer, anotherFloatLayer)
      }
  7. implicit def Float/Float[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers.

    The returned Case is used by the polymorphic function /, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods./(inputFloatLayer, anotherFloatLayer)
      }
  8. object Layers

  9. object OptimizerFactory

  10. object Optimizers

    Optimizers of Float.

    Example:
    1. implicit val optimizerFactory = new DifferentiableFloat.OptimizerFactory {
        override def floatOptimizer(weight: Weight): Optimizer = {
          new LearningRate with L2Regularization {
            var learningRate = 0.00003f
            override protected def l2Regularization: Float = 0.003f
            override protected def currentLearningRate(): Float = {
              // decay the learning rate at every step, then return it
              learningRate *= 0.75f
              learningRate
            }
          }
        }
      }
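
    Once such a factory is in implicit scope, it is presumably picked up whenever a new Float weight is created, so that floatOptimizer decides how each individual weight is updated during training.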
  11. implicit def abs(Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts a Float Layer for the polymorphic function abs.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.abs(inputFloatLayer)
      }
  12. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  13. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  14. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  16. implicit def exp(Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts a Float Layer for the polymorphic function exp.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.exp(inputFloatLayer)
      }
  17. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  18. implicit def floatToLiteral: Aux[Float, Float, Float]

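    Presumably lifts a native Float into a literal (constant) layer, so that raw Float values can appear where a layer is expected (a hedged reading of the Aux[Float, Float, Float] signature).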
  19. implicit def floatTrainable: Trainable[Float, Float]


    See also

    Trainable
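
    A hedged sketch of how this instance is consumed: train is assumed to come from DifferentiableAny in the same library, and myNetwork / inputData are hypothetical placeholders.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.DifferentiableAny._
      // for a network whose output is Float, train uses this implicit
      // Trainable[Float, Float] to treat the Float output as the loss
      myNetwork.train(inputData)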

  20. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  21. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  22. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  23. implicit def log(Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts a Float Layer for the polymorphic function log.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.log(inputFloatLayer)
      }
  24. implicit def max(Float,Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers for the polymorphic function max.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.max(inputFloatLayer, anotherFloatLayer)
      }
  25. implicit def min(Float,Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers for the polymorphic function min.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.min(inputFloatLayer, anotherFloatLayer)
      }
  26. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  27. final def notify(): Unit

    Definition Classes
    AnyRef
  28. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  29. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  30. implicit def toFloatLayerOps[From, Input <: Tape](from: From)(implicit toLayer: OfPlaceholder[From, Input, FloatPlaceholder]): FloatLayerOps[Input]

    Implicitly converts any layer to FloatLayerOps, which enables common methods for Float layers.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
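
    A slightly fuller sketch (hedged: unary_- on FloatLayerOps is an assumption, mirroring the analogous Double namespace; it is not listed on this page):

      import com.thoughtworks.deeplearning.Symbolic
      // toFloatLayerOps converts the symbolic Float to FloatLayerOps,
      // making the assumed unary_- available on it
      def negated(implicit x: Float @Symbolic) = -x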
  31. def toString(): String

    Definition Classes
    AnyRef → Any
  32. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  33. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  34. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
