Describe the bug
I want to save a model trained with TensorFlow that contains Dropout layers. I need these layers to stay active at inference time (i.e. run in training mode) so I can measure the epistemic uncertainty of my model with Monte Carlo dropout. It seems that tf2onnx's convert.from_keras does not export the Dropout layers.
System information
- OS: Windows 11
- TensorFlow version: 2.7.0
- Python version: 3.8.10
To Reproduce
Here is the code example that I use:
```python
import tf2onnx
import onnx
from tensorflow.keras.layers import Dense, Input, Dropout
from tensorflow.keras.models import Model

input_layer = Input(shape=(96,))
h = Dense(128, activation="relu")(input_layer)
h = Dropout(0.1)(h)
h = Dense(128, activation="relu")(h)
h = Dropout(0.1)(h)
h = Dense(12, activation=None)(h)
model = Model(input_layer, h, name="test_dropout")

model_onnx, _ = tf2onnx.convert.from_keras(model, output_path="test_dropout.onnx")
test = onnx.load("test_dropout.onnx")
print(test.graph.node)
```
In the nodes displayed there is no mention of Dropout. The thing is, I want the Dropout layers to run in training mode during inference so I can use MC dropout to get a measurement of the model's epistemic uncertainty.
Is there something I am doing wrong here, or is it simply not possible to do so?
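For reference, the kind of MC-dropout uncertainty estimate I am after can be sketched in plain NumPy (the layer shapes and the 0.1 rate are just for illustration, matching the model above):

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_predict(x, weights, rate=0.1, n_samples=100):
    """Run n_samples stochastic forward passes of one dense layer with
    dropout kept active, and return the mean prediction and its standard
    deviation across samples (a proxy for epistemic uncertainty)."""
    preds = []
    for _ in range(n_samples):
        # Inverted-dropout mask, applied even though this is "inference".
        mask = rng.random(x.shape) >= rate
        dropped = x * mask / (1.0 - rate)
        preds.append(dropped @ weights)
    preds = np.stack(preds)
    return preds.mean(axis=0), preds.std(axis=0)

x = rng.normal(size=(1, 96))
w = rng.normal(size=(96, 12))
mean, std = mc_dropout_predict(x, w)
```

This is exactly why I need the Dropout nodes to survive the export: without them, every ONNX inference pass is deterministic and std collapses to zero.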
The output of test.graph.node is:
```
[input: "input_1"
input: "model/dense/MatMul/ReadVariableOp:0"
output: "model/dense/MatMul:0"
name: "model/dense/MatMul"
op_type: "MatMul"
, input: "model/dense/MatMul:0"
output: "model/dense/Relu:0"
name: "model/dense/Relu"
op_type: "Relu"
, input: "model/dense/Relu:0"
input: "model/dense_1/MatMul/ReadVariableOp:0"
output: "model/dense_1/MatMul:0"
name: "model/dense_1/MatMul"
op_type: "MatMul"
, input: "model/dense_1/MatMul:0"
output: "model/dense_1/Relu:0"
name: "model/dense_1/Relu"
op_type: "Relu"
, input: "model/dense_1/Relu:0"
input: "model/dense_2/MatMul/ReadVariableOp:0"
output: "dense_2"
name: "model/dense_2/MatMul"
op_type: "MatMul"
]
```
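For context, in plain Keras the usual way I keep dropout active at inference is a small subclass like the one below (a sketch with names of my own choosing; whether tf2onnx would then export it as an ONNX Dropout node with training_mode set, rather than folding it away, is exactly what I am unsure about):

```python
import tensorflow as tf

class MCDropout(tf.keras.layers.Dropout):
    """Dropout that stays active regardless of the training flag,
    so every forward pass samples a fresh mask (MC dropout)."""

    def call(self, inputs, training=None):
        # Ignore the incoming flag and always apply the mask.
        return super().call(inputs, training=True)
```

If there is a supported way to get the same always-on behaviour in the exported ONNX graph, that would solve my problem.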