ActivationOperationsConverter

Converter for activation function operations.

This converter implements various activation functions using StableHLO primitives:

  • sigmoid: using stablehlo.exponential and arithmetic operations

  • softmax: using stablehlo.reduce and stablehlo.broadcast_in_dim

  • tanh, gelu, swish: using appropriate StableHLO operations

Note: relu is already implemented in LegacyOperationsConverter
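To make the sigmoid decomposition concrete, here is a minimal scalar sketch in Kotlin. It mirrors, numerically, the chain of StableHLO primitives described above (negate, exponential, add, divide); it is an illustration of the math, not the converter's actual emission code.

```kotlin
import kotlin.math.abs
import kotlin.math.exp

// Scalar mirror of the lowering sigmoid(x) = 1 / (1 + exponential(-x)).
// Each step corresponds to one StableHLO primitive the converter would emit.
fun sigmoidDecomposed(x: Double): Double {
    val negated = -x          // stablehlo.negate
    val e = exp(negated)      // stablehlo.exponential
    val denom = 1.0 + e       // stablehlo.add (with a constant one)
    return 1.0 / denom        // stablehlo.divide
}

fun main() {
    check(abs(sigmoidDecomposed(0.0) - 0.5) < 1e-12)  // sigmoid(0) = 0.5
    check(sigmoidDecomposed(10.0) > 0.999)            // saturates toward 1
    check(sigmoidDecomposed(-10.0) < 0.001)           // saturates toward 0
}
```

Expressing sigmoid through these primitives keeps the converter dependent only on basic StableHLO arithmetic rather than a dedicated sigmoid op.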

Supports operations as specified in Requirements 2.3:

  • Activation functions (relu, sigmoid, softmax)

  • Additional activations (tanh, gelu, swish)
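The softmax lowering via stablehlo.reduce and stablehlo.broadcast_in_dim can likewise be sketched numerically. The 1-D Kotlin version below follows the standard numerically stable recipe (subtract the max before exponentiating); the comments mark which StableHLO op each step corresponds to. This is an assumption about the decomposition, not the library's literal output.

```kotlin
import kotlin.math.exp

// Numerically stable softmax along one axis, step by step:
//   max  -> stablehlo.reduce (max)
//   x-m  -> stablehlo.broadcast_in_dim + stablehlo.subtract
//   exp  -> stablehlo.exponential
//   sum  -> stablehlo.reduce (add)
//   /sum -> stablehlo.broadcast_in_dim + stablehlo.divide
fun softmaxDecomposed(xs: DoubleArray): DoubleArray {
    require(xs.isNotEmpty()) { "softmax needs at least one element" }
    val max = xs.max()
    val exps = DoubleArray(xs.size) { exp(xs[it] - max) }
    val sum = exps.sum()
    return DoubleArray(xs.size) { exps[it] / sum }
}

fun main() {
    val probs = softmaxDecomposed(doubleArrayOf(1.0, 2.0, 3.0))
    check(kotlin.math.abs(probs.sum() - 1.0) < 1e-12)  // probabilities sum to 1
    check(probs[2] > probs[1] && probs[1] > probs[0])  // order preserved
}
```

The max-subtraction step is why the lowering needs a reduce before the exponential: without it, large logits overflow exp.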

Constructors


constructor()

Properties

open override val supportedOperations: Set<String>

Set of operation names this converter supports

Functions

open override fun convert(node: GraphNode, operands: List<String>, context: ConversionContext): ConversionResult

Convert a graph node to StableHLO operations
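For orientation, here is a hypothetical, heavily simplified sketch of how such a converter might dispatch on supportedOperations. The GraphNode and ConversionResult stand-ins below are invented stubs (the real types in this library are richer and include a ConversionContext), so treat this purely as an illustration of the pattern.

```kotlin
// Invented stub types: the real GraphNode / ConversionResult carry more state.
data class GraphNode(val opType: String, val name: String)
data class ConversionResult(val ops: List<String>, val resultId: String)

// Sketch of the dispatch pattern: check supportedOperations, then emit the
// StableHLO text for the requested activation (only sigmoid shown).
class ActivationConverterSketch {
    val supportedOperations: Set<String> =
        setOf("sigmoid", "softmax", "tanh", "gelu", "swish")

    fun convert(node: GraphNode, operands: List<String>): ConversionResult {
        require(node.opType in supportedOperations) { "unsupported: ${node.opType}" }
        return when (node.opType) {
            "sigmoid" -> {
                val x = operands.single()
                ConversionResult(
                    ops = listOf(
                        "%neg = stablehlo.negate $x",
                        "%exp = stablehlo.exponential %neg",
                        "%den = stablehlo.add %one, %exp",
                        "%out = stablehlo.divide %one, %den",
                    ),
                    resultId = "%out",
                )
            }
            else -> ConversionResult(emptyList(), "%todo") // other activations elided
        }
    }
}

fun main() {
    val sketch = ActivationConverterSketch()
    val result = sketch.convert(GraphNode("sigmoid", "act0"), listOf("%x"))
    check(result.ops.size == 4)        // four primitive ops for sigmoid
    check(result.resultId == "%out")
    check("relu" !in sketch.supportedOperations)  // relu lives in LegacyOperationsConverter
}
```

The when-dispatch keeps each activation's lowering in one place, and the supportedOperations set lets the surrounding conversion pipeline route nodes to the right converter.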