Update to OnnxRuntime 1.6.0 and fix a bug with sequence outputs (#5529)
* Use OnnxRuntime prerelease

* Upgrade to OnnxRuntime 1.6.0

* Update docs

* Fix problem with sequence outputs
antoniovs1029 authored Dec 11, 2020
1 parent 3e72d19 commit 5038e81
Showing 3 changed files with 9 additions and 3 deletions.
2 changes: 1 addition & 1 deletion eng/Versions.props
@@ -23,7 +23,7 @@
<GoogleProtobufPackageVersion>3.10.1</GoogleProtobufPackageVersion>
<LightGBMPackageVersion>2.2.3</LightGBMPackageVersion>
<MicrosoftExtensionsPackageVersion>2.1.0</MicrosoftExtensionsPackageVersion>
-<MicrosoftMLOnnxRuntimePackageVersion>1.5.2</MicrosoftMLOnnxRuntimePackageVersion>
+<MicrosoftMLOnnxRuntimePackageVersion>1.6.0</MicrosoftMLOnnxRuntimePackageVersion>
<MlNetMklDepsPackageVersion>0.0.0.9</MlNetMklDepsPackageVersion>
<ParquetDotNetPackageVersion>2.1.3</ParquetDotNetPackageVersion>
<SystemDrawingCommonPackageVersion>4.5.0</SystemDrawingCommonPackageVersion>
2 changes: 1 addition & 1 deletion src/Microsoft.ML.OnnxTransformer/OnnxTransform.cs
@@ -768,7 +768,7 @@ public NamedOnnxValue GetNamedOnnxValue()
/// | Does this estimator need to look at the data to train its parameters? | No |
/// | Input column data type | Known-sized vector of <xref:System.Single> or <xref:System.Double> types |
/// | Output column data type | As specified by the ONNX model |
-/// | Required NuGet in addition to Microsoft.ML | Microsoft.ML.OnnxTransformer (always), either Microsoft.ML.OnnxRuntime 1.5.2 (for CPU processing) or Microsoft.ML.OnnxRuntime.Gpu 1.5.2 (for GPU processing if GPU is available) |
+/// | Required NuGet in addition to Microsoft.ML | Microsoft.ML.OnnxTransformer (always), either Microsoft.ML.OnnxRuntime 1.6.0 (for CPU processing) or Microsoft.ML.OnnxRuntime.Gpu 1.6.0 (for GPU processing if GPU is available) |
/// | Exportable to ONNX | No |
///
/// To create this estimator use the following APIs:
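
For context on the doc-comment change above, here is a minimal, hypothetical sketch of the usage it describes: scoring an ONNX model through OnnxTransformer with Microsoft.ML, Microsoft.ML.OnnxTransformer, and Microsoft.ML.OnnxRuntime 1.6.0 (or Microsoft.ML.OnnxRuntime.Gpu 1.6.0 on a GPU machine) referenced from NuGet. The model path "model.onnx", the column names "input" and "output", and the vector size 4 are placeholders, not part of this commit.

using System;
using Microsoft.ML;
using Microsoft.ML.Data;

public class ModelInput
{
    // The vector size must match the ONNX model's input shape; 4 is a placeholder.
    [VectorType(4)]
    [ColumnName("input")]
    public float[] Features { get; set; }
}

public class ModelOutput
{
    [ColumnName("output")]
    public float[] Scores { get; set; }
}

public static class OnnxScoringSketch
{
    public static void Main()
    {
        var mlContext = new MLContext();

        // Build a pipeline whose only step is the ONNX scorer.
        var pipeline = mlContext.Transforms.ApplyOnnxModel(
            outputColumnNames: new[] { "output" },
            inputColumnNames: new[] { "input" },
            modelFile: "model.onnx");

        // OnnxTransformer is not trained on data, so fitting on an empty view
        // only serves to materialize the transformer with the right schema.
        var emptyData = mlContext.Data.LoadFromEnumerable(new ModelInput[0]);
        var transformer = pipeline.Fit(emptyData);

        // Score a single example with a prediction engine.
        var engine = mlContext.Model.CreatePredictionEngine<ModelInput, ModelOutput>(transformer);
        var prediction = engine.Predict(new ModelInput { Features = new float[] { 1f, 2f, 3f, 4f } });
        Console.WriteLine(string.Join(", ", prediction.Scores ?? Array.Empty<float>()));
    }
}

Only the package version in the documentation table changes in this commit; the pipeline shape above is unaffected.
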
8 changes: 7 additions & 1 deletion src/Microsoft.ML.OnnxTransformer/OnnxTypeParser.cs
@@ -267,7 +267,13 @@ private class CastHelper

public static IEnumerable<TDst> CastOnnxSequenceToIEnumerable<TSrc, TDst>(IEnumerable<TSrc> o, Func<TSrc, object> caster)
{
-return o.Select(v => (TDst)caster(v));
+// Since we now dispose the NamedOnnxValue objects
+// after running inference on each output, we need
+// to copy (enumerate) the output via ".ToList()";
+// otherwise, if users try to keep the past sequence
+// outputs of their OnnxTransformer, they would
+// end up with empty sequences.
+return o.Select(v => (TDst)caster(v)).ToList();
}
}

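
The ".ToList()" added above matters because LINQ's Select is lazily evaluated: without the eager copy, the cast would run only when the caller enumerates the result, after OnnxTransformer has already disposed the underlying NamedOnnxValue outputs. The following self-contained sketch is not ML.NET code; DisposableValue is a hypothetical stand-in for those disposed outputs, used only to show the difference between the deferred and the materialized enumeration.

using System;
using System.Collections.Generic;
using System.Linq;

public sealed class DisposableValue : IDisposable
{
    private int[] _payload = { 1, 2, 3 };

    // After Dispose, the payload is gone and an empty array is returned,
    // mimicking a sequence output read from a disposed NamedOnnxValue.
    public int[] Payload => _payload ?? Array.Empty<int>();

    public void Dispose() => _payload = null;
}

public static class DeferredEnumerationSketch
{
    public static void Main()
    {
        var sources = new List<DisposableValue> { new DisposableValue(), new DisposableValue() };

        // Lazy projection: nothing is read from the sources yet.
        IEnumerable<int[]> lazy = sources.Select(v => v.Payload);

        // Eager copy: the payloads are read while the sources are still alive.
        List<int[]> eager = sources.Select(v => v.Payload).ToList();

        // Simulate OnnxTransformer disposing its outputs after inference.
        foreach (var s in sources)
            s.Dispose();

        Console.WriteLine(lazy.Sum(p => p.Length));   // 0 — enumerated after disposal, payloads are empty
        Console.WriteLine(eager.Sum(p => p.Length));  // 6 — values were materialized before disposal
    }
}

Copying each sequence once is a small cost compared to handing callers a lazy view over disposed values, which is the trade-off the new comment in the diff describes.
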
