ONNX-exported model outputs a different value per inference call for the same input

I used BertModel in my PyTorch model and appended a few layers on top for classification.

I exported the PyTorch model to ONNX. The export succeeds, but the ONNX model then produces a different output value for the same input on every inference call. It feels as though the model is still in training mode, even though it is not. I exported the model using the following code:

torch.onnx.export(
    model,
    (my_inputs1, my_inputs2),
    "model.onnx",
    input_names=['my_inputs1', 'my_inputs2'],
    output_names=['outputs'],
    dynamic_axes={
        'my_inputs1': {0: 'batch'},
        'my_inputs2': {0: 'batch'},
    },
    opset_version=11,
    do_constant_folding=True,
    enable_onnx_checker=True,
)

Updated reproducible Colab