Why is ONNX Runtime inference much slower than PyTorch inference?

#2
by purejomo - opened

I expected inference with onnxruntime to be faster than plain PyTorch inference, but it turns out to be much slower. I'd like to understand how that can happen.
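For reference, here is a minimal sketch of how I'm comparing the two. The model ID, input text, exported `model.onnx` path, and CPU-only setup are placeholders/assumptions, not my exact setup:

```python
import time

import numpy as np
import onnxruntime as ort
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "bert-base-uncased"  # placeholder; swap in the actual checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
inputs = tokenizer("A short example sentence.", return_tensors="pt")

# --- PyTorch (eager) inference ---
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()
with torch.no_grad():
    start = time.perf_counter()
    for _ in range(100):
        model(**inputs)
    print(f"PyTorch: {(time.perf_counter() - start) / 100 * 1000:.2f} ms/iter")

# --- ONNX Runtime inference (assumes the model was already exported to model.onnx) ---
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
ort_inputs = {k: v.numpy() for k, v in inputs.items()}
start = time.perf_counter()
for _ in range(100):
    session.run(None, ort_inputs)
print(f"ONNX Runtime: {(time.perf_counter() - start) / 100 * 1000:.2f} ms/iter")

# Check which execution providers the session is actually using.
print("Active providers:", session.get_providers())
```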
