pytorch_pfn_extras.onnx.export_testcase

pytorch_pfn_extras.onnx.export_testcase(model, args, out_dir, *, output_grad=False, metadata=True, model_overwrite=True, strip_large_tensor_data=False, large_tensor_threshold=100, return_output=False, user_meta=None, export_torch_script=False, export_torch_trace=False, **kwargs)

Export the model and its input/output tensors in ONNX protobuf format.

Parameters
  • output_grad (bool or Tensor) – If True, the model’s gradients are also exported with the names ‘gradient_%d.pb’. If a Tensor is given, it is used as the gradient input, and the gradient inputs are exported as ‘gradient_input_%d.pb’ along with the gradients.

  • metadata (bool) – If True, output meta information taken from git log.

  • model_overwrite (bool) – If False and model.onnx already exists, only export the input/output data as another test dataset.

  • strip_large_tensor_data (bool) – If True, strip the data of large tensors to reduce the ONNX file size for benchmarking.

  • large_tensor_threshold (int) – If the number of elements in a tensor is larger than this value, the tensor is stripped when strip_large_tensor_data is True.

  • return_output (bool) – If True, return the output values produced by the model.

  • export_torch_script (bool) – Output model_script.pt using torch.jit.script.

  • export_torch_trace (bool) – Output model_trace.pt using torch.jit.trace.

Warning

This function is not thread safe.
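
Example

A minimal usage sketch (the Net module, the input shape, and the ‘out_dir’ path are illustrative, not part of this API):

    import torch
    import pytorch_pfn_extras.onnx as ppe_onnx

    class Net(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.linear = torch.nn.Linear(10, 5)

        def forward(self, x):
            return self.linear(x)

    model = Net()
    x = torch.zeros((1, 10))

    # Expected to write model.onnx plus a test_data_set_0/ directory with
    # input_*.pb and output_*.pb under 'out_dir'; output_grad=True
    # additionally dumps gradient_*.pb files.
    ppe_onnx.export_testcase(
        model,
        (x,),        # model inputs, passed as for torch.onnx.export
        'out_dir',
        output_grad=True,
    )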