This repository was archived by the owner on Mar 26, 2019. It is now read-only.

Output from MXNet model differs from sample from onnx/models #42

Description

@ThomasDelteil

Hi,

I downloaded a model (inception_v2) from the onnx/models repo and loaded it in MXNet. I ran the sample input data through it and got an output that differs from the sample output bundled with the model. Is this a known issue?

This is the notebook I am using:
https://github.com/ThomasDelteil/Gluon_ONNX/blob/master/Fine-tuning_ONNX.ipynb

The code is reproduced below in case I am making an obvious mistake:

import numpy as np
import onnx
import onnx_mxnet
import mxnet as mx
from collections import namedtuple

inception_v2 = "https://s3.amazonaws.com/download.onnx/models/inception_v2.tar.gz"
model_links = [inception_v2]
model_folder = "model"
model_name = "inception_v2"

# Download the models
for link in model_links:
    !mkdir -p $model_folder
    !wget -P $model_folder $link -nc -nv
# Extract the chosen model
!tar -xzf $model_folder/*.tar.gz -C $model_folder

# Helper function to load the sample inputs/outputs bundled with the model
# (note: `model_path` is defined further down, before this is first called)
def load_sample(index=0):
    numpy_path = "{}/test_data_{}.npz".format(model_path, index)
    sample = np.load(numpy_path, encoding='bytes')
    inputs = sample['inputs'][0]
    outputs = sample['outputs'][0]
    return inputs, outputs

# load the model in MXNet using `onnx-mxnet`

# Set the file-paths
model_path = "{}/{}".format(model_folder, model_name)
onnx_path = "{}/model.onnx".format(model_path)

# Load the model symbol and parameters
sym, params = onnx_mxnet.import_model(onnx_path)

# We pick the mxnet compute context:
ctx = mx.cpu()

# Get some sample data to infer the shapes
inputs, outputs = load_sample(0)

# By default, 'input_0' is an input of the imported model.
mod = mx.mod.Module(symbol=sym, data_names=['input_0'], context=ctx, label_names=None)
mod.bind(for_training=False, data_shapes=[('input_0', inputs.shape)], label_shapes=None)
mod.set_params(arg_params=params, aux_params=params, allow_missing=False, allow_extra=False)

# Test the model using the sample data
Batch = namedtuple('Batch', ['data'])

inputs, outputs = load_sample(0)

# Forward pass on the provided data batch
mod.forward(Batch([mx.nd.array(inputs)]))
model_output = mod.get_outputs()

print(model_output[0][0][0])
# [  2.84355319e-05]
# <NDArray 1 @cpu(0)>
print(outputs[0][0])
# 0.00016256621
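
For reference, this is how I am quantifying the gap (a minimal sketch reusing `model_output` and `outputs` from the snippet above; the tolerances are arbitrary, and I would expect the bundled sample data to match much more tightly than this):

# Compare the MXNet output against the reference output shipped with the model
mxnet_result = model_output[0].asnumpy()   # MXNet NDArray -> numpy array
reference = outputs                        # sample output from test_data_0.npz

print("max abs diff:", np.abs(mxnet_result - reference).max())
print("allclose:", np.allclose(mxnet_result, reference, rtol=1e-3, atol=1e-5))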
