ONNX Runtime error: node->GetOutputEdgesCount() == 0 was false. Can't remove node

I have a simple Keras RNN model composed of an embedding layer, an LSTM, and a dense (linear) output layer:

loaded_model.layers

Out[23]: 
[<keras.layers.embeddings.Embedding at 0x2275dc1f6a0>,
 <keras.layers.recurrent_v2.LSTM at 0x2275dc8d5b0>,
 <keras.layers.core.dense.Dense at 0x2275dd17730>,
 <keras.layers.core.activation.Activation at 0x2275de3ee80>]

The model works fine in Keras when saved and reloaded, but after converting the loaded model to ONNX opset 15 with tf2onnx.convert.from_keras, initializing the InferenceSession fails.
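For reference, the conversion step looks roughly like this (the input signature is illustrative; the real model just takes integer token ids for the Embedding layer):

import tensorflow as tf
import tf2onnx
import onnxruntime as ort

# loaded_model is the Keras model shown above
spec = (tf.TensorSpec((None, None), tf.int32, name="input"),)
model_proto, _ = tf2onnx.convert.from_keras(
    loaded_model, input_signature=spec, opset=15,
    output_path="model.onnx",
)

# This is the line that raises the exception
sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

The full traceback: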

onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: D:\a\_work\1\s\onnxruntime\core\graph\graph.cc:3275 onnxruntime::Graph::RemoveNode node->GetOutputEdgesCount() == 0 was false. Can't remove node sequential/lstm_7/transpose as it still has output edges.

This is the relevant node in Netron:

[Netron view of the ONNX model]

Indeed it has output edges...

I'd like to get past this error. Is it caused by some graph optimization that I can turn off, e.g. via the disabled_optimizers argument of InferenceSession? (That argument is unfortunately undocumented.)
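In case it helps, this is what I mean, as a sketch. The documented knob I know of is SessionOptions.graph_optimization_level; disabled_optimizers seems to take a list of optimizer names, but I'm only guessing at the name of the transpose pass:

import onnxruntime as ort

# Option 1: disable all graph optimizations (documented API)
so = ort.SessionOptions()
so.graph_optimization_level = ort.GraphOptimizationLevel.ORT_DISABLE_ALL
sess = ort.InferenceSession("model.onnx", sess_options=so,
                            providers=["CPUExecutionProvider"])

# Option 2: the undocumented kwarg; "TransposeOptimizer" is a guess
# at the name of the pass that tries to remove the transpose node
sess = ort.InferenceSession("model.onnx",
                            disabled_optimizers=["TransposeOptimizer"],
                            providers=["CPUExecutionProvider"])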

Thank you.
