generateActivationFunctionWithAccuracy
Generates C code for activation functions that is numerically consistent with the existing DefaultCpuOps implementations.
Enhanced for numerical accuracy by:
Using the same mathematical functions as DefaultCpuOps
Implementing consistent handling of edge cases (NaN, infinity)
Ensuring exact transcendental function behavior
Adding input validation for numerical stability
Supported activations:
ReLU: max(0, x) using fmaxf() - matches DefaultCpuOps.relu()
Sigmoid: 1 / (1 + exp(-x)) using expf() - matches DefaultCpuOps.sigmoid()
Tanh: tanh(x) using tanhf() - matches DefaultCpuOps.tanh()
Returns
A LayerCode containing the generated C code fragment.
Parameters
node
The GraphNode representing the activation function to generate code for.