Merge branch 'master' into transformsDocs2
sfilipi committed Jul 13, 2018
2 parents 781293d + ceac01f commit db2592d
Showing 17 changed files with 510 additions and 511 deletions.
7 changes: 3 additions & 4 deletions src/Microsoft.ML/Models/OnnxConverter.cs
@@ -13,11 +13,10 @@ public sealed partial class OnnxConverter
/// <a href="https://onnx.ai/">ONNX</a> is an intermediate representation format
/// for machine learning models. It is used to make models portable such that you can
/// train a model using one toolkit and run it in another toolkit's runtime, for example,
- /// you can create a model using ML.NET (or any ONNX-compatible toolkit), convert it to ONNX, and
- /// then the ONNX model can be converted into, say, a CoreML, TensorFlow, or WinML model
- /// to run on the respective runtime.
+ /// you can create a model using ML.NET, export it to an ONNX-ML model file,
+ /// then load and run that ONNX-ML model in Windows ML, in a UWP Windows 10 app.
///
- /// This API converts an ML.NET model to ONNX format by inspecting the transform pipeline
+ /// This API converts an ML.NET model to ONNX-ML format by inspecting the transform pipeline
/// from the end, checking for components that know how to save themselves as ONNX.
/// The first item in the transform pipeline that does not know how to save itself
/// as ONNX is considered the "input" to the ONNX pipeline. (Ideally this would be the
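For orientation, this is roughly how the converter described in the doc comment above was driven from the legacy Microsoft.ML entry-point API. Treat it as a minimal sketch, not code from this commit: it assumes a non-generic PredictionModel.ReadAsync overload and that the Onnx/Json/Domain properties and Convert(model) entry point have the shapes shown; the file paths and domain name are hypothetical.

```csharp
// Minimal sketch, not taken from this commit: exporting a trained legacy
// ML.NET model to ONNX-ML. Paths and the Domain value are hypothetical.
using Microsoft.ML;
using Microsoft.ML.Models;

public static class OnnxExportSketch
{
    public static void Run()
    {
        // Load a previously trained legacy pipeline from disk (assumed overload).
        PredictionModel model = PredictionModel.ReadAsync("model.zip").Result;

        var converter = new OnnxConverter
        {
            Onnx = "model.onnx",    // binary ONNX-ML file
            Json = "model.json",    // JSON rendering of the same graph
            Domain = "com.example"  // hypothetical ONNX domain name
        };

        // Inspects the transform pipeline from the end, as the comment above
        // describes, and writes out the ONNX-expressible portion.
        converter.Convert(model);
    }
}
```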
@@ -21,8 +21,8 @@ TRUTH ||========================
Precision ||1.0000 |0.9310 |0.8966 |
Accuracy(micro-avg): 0.936709
Accuracy(macro-avg): 0.942857
-Log-loss: 0.312681
-Log-loss reduction: 71.248182
+Log-loss: 0.312759
+Log-loss reduction: 71.240938

Confusion table
||========================
@@ -42,8 +42,8 @@ OVERALL RESULTS
---------------------------------------
Accuracy(micro-avg): 0.947228 (0.0105)
Accuracy(macro-avg): 0.947944 (0.0051)
-Log-loss: 0.253035 (0.0596)
-Log-loss reduction: 76.717466 (5.4693)
+Log-loss: 0.253074 (0.0597)
+Log-loss reduction: 76.713844 (5.4729)

---------------------------------------
Physical memory usage(MB): %Number%
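Since the baselines shift only in the log-loss pair, it may help to recall how the two numbers relate: log-loss reduction is the percentage improvement over the prior classifier (one that always predicts the training label frequencies). A quick round-trip check using the per-fold values above; the formula is the standard one, but the snippet itself is illustrative rather than repo code.

```csharp
// Sketch: relationship between the two metrics that moved in this baseline.
//     reduction = 100 * (priorLogLoss - logLoss) / priorLogLoss
using System;

class LogLossReductionCheck
{
    static void Main()
    {
        double logLoss = 0.312759;     // per-fold value from the baseline above
        double reduction = 71.240938;  // reported reduction, in percent

        // Back out the implied prior log-loss for this fold (~1.0875 for iris).
        double priorLogLoss = logLoss / (1 - reduction / 100);
        Console.WriteLine($"implied prior log-loss: {priorLogLoss:F4}");

        // Recompute the reduction from it, round-tripping the formula.
        double recomputed = 100 * (priorLogLoss - logLoss) / priorLogLoss;
        Console.WriteLine($"recomputed reduction: {recomputed:F6}");
    }
}
```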
@@ -1,4 +1,4 @@
LightGBMMC
Accuracy(micro-avg) Accuracy(macro-avg) Log-loss Log-loss reduction /iter /lr /nl /mil /nt Learner Name Train Dataset Test Dataset Results File Run Time Physical Memory Virtual Memory Command Line Settings
-0.947228 0.947944 0.253035 76.71747 10 0.2 20 10 1 LightGBMMC %Data% %Output% 99 0 0 maml.exe CV tr=LightGBMMC{nt=1 iter=10 v=- lr=0.2 mil=10 nl=20} threads=- dout=%Output% loader=Text{col=Label:TX:0 col=Features:1-*} data=%Data% seed=1 xf=Term{col=Label} /iter:10;/lr:0.2;/nl:20;/mil:10;/nt:1
+0.947228 0.947944 0.253074 76.71384 10 0.2 20 10 1 LightGBMMC %Data% %Output% 99 0 0 maml.exe CV tr=LightGBMMC{nt=1 iter=10 v=- lr=0.2 mil=10 nl=20} threads=- dout=%Output% loader=Text{col=Label:TX:0 col=Features:1-*} data=%Data% seed=1 xf=Term{col=Label} /iter:10;/lr:0.2;/nl:20;/mil:10;/nt:1
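For readers decoding the settings string: in maml.exe shorthand, CV runs cross-validation and tr=LightGBMMC{...} selects the multiclass LightGBM trainer; the swept columns most likely map to iter = boosting iterations, lr = learning rate, nl = maximum leaves per tree, mil = minimum instances per leaf, and nt = thread count, with xf=Term{col=Label} converting the textual labels to keys before training. These expansions are inferred from standard LightGBM naming, not spelled out in this commit.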

148 changes: 74 additions & 74 deletions test/BaselineOutput/SingleDebug/LightGBMMC/LightGBMMC-CV-iris.key.txt
@@ -1,83 +1,83 @@
Instance Label Assigned Log-loss #1 Score #2 Score #3 Score #1 Class #2 Class #3 Class
5 0 0 0.25328578422472414 0.776246 0.1675262 0.0562277846 0 1 2
6 0 0 0.12225769559664824 0.8849203 0.0591422766 0.05593741 0 2 1
8 0 0 0.13903052119099127 0.870201468 0.07016017 0.05963834 0 1 2
9 0 0 0.13970851121881944 0.8696117 0.07047898 0.0599093363 0 1 2
10 0 0 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
11 0 0 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
18 0 0 0.25328578422472414 0.776246 0.1675262 0.0562277846 0 1 2
20 0 0 0.25328578422472414 0.776246 0.1675262 0.0562277846 0 1 2
21 0 0 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
25 0 0 0.13970851121881944 0.8696117 0.07047898 0.0599093363 0 1 2
28 0 0 0.12225769559664824 0.8849203 0.0591422766 0.05593741 0 2 1
31 0 0 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
32 0 0 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
35 0 0 0.13903052119099127 0.870201468 0.07016017 0.05963834 0 1 2
37 0 0 0.13970851121881944 0.8696117 0.07047898 0.0599093363 0 1 2
40 0 0 0.12225769559664824 0.8849203 0.0591422766 0.05593741 0 2 1
41 0 0 0.17550956509619134 0.8390294 0.09255582 0.0684148148 0 1 2
44 0 0 0.25328578422472414 0.776246 0.1675262 0.0562277846 0 1 2
45 0 0 0.13903052119099127 0.870201468 0.07016017 0.05963834 0 1 2
46 0 0 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
48 0 0 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
50 1 1 0.48031316690941278 0.61858964 0.2931589 0.08825144 1 2 0
51 1 1 0.18552267596609509 0.83067 0.09896274 0.07036729 1 0 2
52 1 2 1.8686139310181762 0.745523036 0.154337436 0.1001395 2 1 0
54 1 1 0.45819815025419125 0.632422149 0.3149078 0.0526700951 1 2 0
56 1 1 0.58631345356437459 0.5563746 0.357763946 0.0858614147 1 2 0
60 1 1 0.54904634529954011 0.5775003 0.363432646 0.0590670444 1 0 2
63 1 1 0.442888085987238 0.6421791 0.304338247 0.0534826852 1 2 0
64 1 1 0.14288655580917453 0.8668524 0.06818299 0.06496459 1 2 0
66 1 1 0.13927185898584951 0.8699915 0.06910439 0.060904108 1 2 0
68 1 1 0.1475586146516118 0.862811863 0.08110718 0.0560809337 1 2 0
69 1 1 0.13690026149264065 0.8720572 0.07104707 0.056895718 1 2 0
70 1 1 0.58631345356437459 0.5563746 0.357763946 0.0858614147 1 2 0
71 1 1 0.15194427686527462 0.859036148 0.07716796 0.06379592 1 2 0
72 1 2 1.4639003870351257 0.712372541 0.231332228 0.0562952235 2 1 0
73 1 1 0.45819815025419125 0.632422149 0.3149078 0.0526700951 1 2 0
74 1 1 0.13796619253742226 0.871128142 0.06828712 0.0605847277 1 2 0
76 1 1 0.45819815025419125 0.632422149 0.3149078 0.0526700951 1 2 0
77 1 2 2.0734221020246566 0.815010846 0.1257547 0.05923444 2 1 0
79 1 1 0.54904634529954011 0.5775003 0.363432646 0.0590670444 1 0 2
82 1 1 0.13641919697507263 0.8724768 0.07081407 0.0567091331 1 2 0
88 1 1 0.13407533925580511 0.8745242 0.06425438 0.0612214245 1 2 0
90 1 1 0.1425659799992052 0.867130339 0.0762954 0.0565742739 1 2 0
91 1 1 0.442888085987238 0.6421791 0.304338247 0.0534826852 1 2 0
92 1 1 0.13641919697507263 0.8724768 0.07081407 0.0567091331 1 2 0
93 1 1 0.54904634529954011 0.5775003 0.363432646 0.0590670444 1 0 2
95 1 1 0.13407533925580511 0.8745242 0.06425438 0.0612214245 1 2 0
96 1 1 0.13407533925580511 0.8745242 0.06425438 0.0612214245 1 2 0
97 1 1 0.13796619253742226 0.871128142 0.06828712 0.0605847277 1 2 0
98 1 1 0.54904634529954011 0.5775003 0.363432646 0.0590670444 1 0 2
99 1 1 0.13879074815854195 0.870410144 0.06865643 0.06093342 1 2 0
100 2 2 0.16223550400064654 0.850240946 0.0928473249 0.0569117554 2 0 1
102 2 2 0.12282263476045074 0.8844205 0.0591996 0.056379877 2 1 0
104 2 2 0.12282263476045074 0.8844205 0.0591996 0.056379877 2 1 0
105 2 2 0.12282263476045074 0.8844205 0.0591996 0.056379877 2 1 0
106 2 1 2.3434392875794119 0.8476237 0.09599691 0.05637939 1 2 0
108 2 2 0.22657594234978759 0.7972588 0.1479769 0.0547643229 2 1 0
109 2 2 0.16223550400064654 0.850240946 0.0928473249 0.0569117554 2 0 1
111 2 2 0.177848875720656 0.8370689 0.108788572 0.0541424938 2 1 0
5 0 0 0.25328601458204941 0.776245832 0.167526156 0.0562280267 0 1 2
6 0 0 0.12225796502047259 0.884920061 0.05914253 0.0559373945 0 2 1
8 0 0 0.13903079517192601 0.87020123 0.07016016 0.0596385933 0 1 2
9 0 0 0.13970878538557358 0.869611442 0.07047896 0.05990959 0 1 2
10 0 0 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
11 0 0 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
18 0 0 0.25328601458204941 0.776245832 0.167526156 0.0562280267 0 1 2
20 0 0 0.25328601458204941 0.776245832 0.167526156 0.0562280267 0 1 2
21 0 0 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
25 0 0 0.13970878538557358 0.869611442 0.07047896 0.05990959 0 1 2
28 0 0 0.12225796502047259 0.884920061 0.05914253 0.0559373945 0 2 1
31 0 0 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
32 0 0 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
35 0 0 0.13903079517192601 0.87020123 0.07016016 0.0596385933 0 1 2
37 0 0 0.13970878538557358 0.869611442 0.07047896 0.05990959 0 1 2
40 0 0 0.12225796502047259 0.884920061 0.05914253 0.0559373945 0 2 1
41 0 0 0.17550530270540357 0.839032948 0.09255621 0.068410866 0 1 2
44 0 0 0.25328601458204941 0.776245832 0.167526156 0.0562280267 0 1 2
45 0 0 0.13903079517192601 0.87020123 0.07016016 0.0596385933 0 1 2
46 0 0 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
48 0 0 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
50 1 1 0.48031528673730972 0.6185883 0.2931604 0.0882512555 1 2 0
51 1 1 0.1855230347406718 0.8306697 0.09896271 0.0703676 1 0 2
52 1 2 1.8686158620047173 0.7455236 0.154337138 0.100139305 2 1 0
54 1 1 0.45820041221338154 0.6324207 0.3149093 0.05266998 1 2 0
56 1 1 0.58631409634709242 0.556374252 0.357764423 0.08586135 1 2 0
60 1 1 0.5490426296940486 0.5775024 0.363434 0.0590636 1 0 2
63 1 1 0.44289031357940112 0.642177641 0.3043398 0.053482566 1 2 0
64 1 1 0.14288683084862894 0.866852164 0.06818328 0.06496458 1 2 0
66 1 1 0.13881478450725465 0.8703892 0.0686788 0.0609319545 1 2 0
68 1 1 0.14755364077036032 0.862816155 0.0811026245 0.0560812131 1 2 0
69 1 1 0.13689581878715465 0.8720611 0.07104298 0.0568959676 1 2 0
70 1 1 0.58631409634709242 0.556374252 0.357764423 0.08586135 1 2 0
71 1 1 0.15245577872775815 0.858596861 0.07763986 0.0637633 1 2 0
72 1 2 1.4638898231048283 0.7123695 0.231334671 0.05629582 2 1 0
73 1 1 0.45820041221338154 0.6324207 0.3149093 0.05266998 1 2 0
74 1 1 0.13842123633682313 0.870731831 0.068711035 0.06055716 1 2 0
76 1 1 0.45820041221338154 0.6324207 0.3149093 0.05266998 1 2 0
77 1 2 2.0734225760002563 0.815010965 0.12575464 0.0592344068 2 1 0
79 1 1 0.5490426296940486 0.5775024 0.363434 0.0590636 1 0 2
82 1 1 0.13641482472258923 0.872480631 0.07081 0.0567093827 1 2 0
88 1 1 0.13407561188247241 0.874523938 0.06425465 0.0612214059 1 2 0
90 1 1 0.14206322054986673 0.8675664 0.07583086 0.0566027239 1 2 0
91 1 1 0.44289031357940112 0.642177641 0.3043398 0.053482566 1 2 0
92 1 1 0.13641482472258923 0.872480631 0.07081 0.0567093827 1 2 0
93 1 1 0.5490426296940486 0.5775024 0.363434 0.0590636 1 0 2
95 1 1 0.13407561188247241 0.874523938 0.06425465 0.0612214059 1 2 0
96 1 1 0.13407561188247241 0.874523938 0.06425465 0.0612214059 1 2 0
97 1 1 0.13842123633682313 0.870731831 0.068711035 0.06055716 1 2 0
98 1 1 0.5490426296940486 0.5775024 0.363434 0.0590636 1 0 2
99 1 1 0.13879102207379132 0.8704099 0.06865671 0.0609334 1 2 0
100 2 2 0.16223550400064654 0.850240946 0.0928473249 0.05691175 2 0 1
102 2 2 0.12282263476045074 0.8844205 0.0591995977 0.0563798733 2 1 0
104 2 2 0.12282263476045074 0.8844205 0.0591995977 0.0563798733 2 1 0
105 2 2 0.12282263476045074 0.8844205 0.0591995977 0.0563798733 2 1 0
106 2 1 2.3492272871651649 0.84814316 0.09544288 0.05641394 1 2 0
108 2 2 0.22657586758781198 0.797258854 0.14797686 0.0547643043 2 1 0
109 2 2 0.16223550400064654 0.850240946 0.0928473249 0.05691175 2 0 1
111 2 2 0.177848875720656 0.8370689 0.108788565 0.0541424938 2 1 0
112 2 2 0.13281455464792449 0.875627458 0.06831084 0.0560617261 2 1 0
113 2 2 0.19621674447868781 0.8218341 0.12273933 0.05542656 2 1 0
113 2 2 0.1962166719523184 0.821834147 0.122739322 0.0554265566 2 1 0
115 2 2 0.17200937673419167 0.8419713 0.09234353 0.0656852052 2 0 1
117 2 2 0.16223550400064654 0.850240946 0.0928473249 0.0569117554 2 0 1
120 2 2 0.16223550400064654 0.850240946 0.0928473249 0.0569117554 2 0 1
121 2 2 0.16411842591849909 0.8486415 0.09412396 0.05723452 2 1 0
117 2 2 0.16223550400064654 0.850240946 0.0928473249 0.05691175 2 0 1
120 2 2 0.16223550400064654 0.850240946 0.0928473249 0.05691175 2 0 1
121 2 2 0.16411842591849909 0.8486415 0.09412395 0.0572345145 2 1 0
122 2 2 0.13716672321276682 0.871824861 0.07206305 0.0561121143 2 1 0
123 2 2 0.28256671512014453 0.753846347 0.189867079 0.05628657 2 1 0
125 2 2 0.20564890133993838 0.814118862 0.09413585 0.09174529 2 0 1
123 2 2 0.28256655698542577 0.753846467 0.18986699 0.0562865436 2 1 0
125 2 2 0.20564882812625254 0.8141189 0.09413582 0.09174526 2 0 1
128 2 2 0.13716672321276682 0.871824861 0.07206305 0.0561121143 2 1 0
129 2 2 0.16567795334648433 0.847319067 0.09548671 0.057194218 2 1 0
131 2 2 0.16223550400064654 0.850240946 0.0928473249 0.0569117554 2 0 1
129 2 2 0.16567788300150446 0.8473191 0.09548668 0.0571942 2 1 0
131 2 2 0.16223550400064654 0.850240946 0.0928473249 0.05691175 2 0 1
132 2 2 0.13716672321276682 0.871824861 0.07206305 0.0561121143 2 1 0
133 2 2 0.29113037794713281 0.7474182 0.191831991 0.0607497729 2 1 0
137 2 2 0.22116862531406531 0.8015815 0.104995139 0.09342336 2 1 0
138 2 1 0.99148905684440769 0.5769956 0.3710238 0.05198058 1 2 0
141 2 2 0.18520119392899573 0.8309371 0.09454043 0.07452248 2 0 1
144 2 2 0.16223550400064654 0.850240946 0.0928473249 0.0569117554 2 0 1
145 2 2 0.14505497806361808 0.864974737 0.07757514 0.0574501 2 1 0
147 2 2 0.14505497806361808 0.864974737 0.07757514 0.0574501 2 1 0
133 2 2 0.29112966022097542 0.747418761 0.191831574 0.06074964 2 1 0
137 2 2 0.22116855095526036 0.801581562 0.1049951 0.09342333 2 1 0
138 2 1 0.99148777165233459 0.5769952 0.371024281 0.05198054 1 2 0
141 2 2 0.18520119392899573 0.8309371 0.09454043 0.07452247 2 0 1
144 2 2 0.16223550400064654 0.850240946 0.0928473249 0.05691175 2 0 1
145 2 2 0.14505497806361808 0.864974737 0.07757514 0.0574500971 2 1 0
147 2 2 0.14505497806361808 0.864974737 0.07757514 0.0574500971 2 1 0
0 0 0 0.12805244799353907 0.879807234 0.0614267066 0.0587660335 0 1 2
1 0 0 0.13227381045206946 0.8761011 0.06422711 0.0596717857 0 1 2
2 0 0 0.12805244799353907 0.879807234 0.0614267066 0.0587660335 0 1 2
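A note on reading the key file above: judging by the column layout, each row's Log-loss is the negative natural log of the probability the model assigned to the row's true class, i.e. the #k Score whose #k Class equals the Label. A quick check against two rows from the old block (illustrative snippet, not repo code):

```csharp
// Sketch: recomputing per-instance log-loss values from the key file above.
using System;

class KeyFileLogLossCheck
{
    static void Main()
    {
        // Instance 5: Label 0, Assigned 0, so the true class's probability
        // is the #1 Score (0.776246). Expected Log-loss: ~0.2532858.
        Console.WriteLine(-Math.Log(0.776246));

        // Instance 52: Label 1, Assigned 2; the true class sits at #2 Class,
        // so its probability is the #2 Score (0.154337436). Expected: ~1.8686139.
        Console.WriteLine(-Math.Log(0.154337436));
    }
}
```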
@@ -23,8 +23,8 @@ TRUTH ||========================================
Precision ||1.0000 |0.9310 |0.8966 |0.0000 |0.0000 |
Accuracy(micro-avg): 0.936709
Accuracy(macro-avg): 0.942857
-Log-loss: 0.312681
-Log-loss reduction: 71.248176
+Log-loss: 0.312759
+Log-loss reduction: 71.240931

Confusion table
||========================================
@@ -46,8 +46,8 @@ OVERALL RESULTS
---------------------------------------
Accuracy(micro-avg): 0.947228 (0.0105)
Accuracy(macro-avg): 0.947944 (0.0051)
-Log-loss: 0.253035 (0.0596)
-Log-loss reduction: 76.717461 (5.4693)
+Log-loss: 0.253074 (0.0597)
+Log-loss reduction: 76.713839 (5.4729)

---------------------------------------
Physical memory usage(MB): %Number%
@@ -1,4 +1,4 @@
LightGBMMC
Accuracy(micro-avg) Accuracy(macro-avg) Log-loss Log-loss reduction /iter /lr /nl /mil /nt Learner Name Train Dataset Test Dataset Results File Run Time Physical Memory Virtual Memory Command Line Settings
-0.947228 0.947944 0.253035 76.71746 10 0.2 20 10 1 LightGBMMC %Data% %Output% 99 0 0 maml.exe CV tr=LightGBMMC{nt=1 iter=10 v=- lr=0.2 mil=10 nl=20} threads=- dout=%Output% loader=Text{col=Label:U4[0-4]:0 col=Features:1-4} data=%Data% seed=1 /iter:10;/lr:0.2;/nl:20;/mil:10;/nt:1
+0.947228 0.947944 0.253074 76.71384 10 0.2 20 10 1 LightGBMMC %Data% %Output% 99 0 0 maml.exe CV tr=LightGBMMC{nt=1 iter=10 v=- lr=0.2 mil=10 nl=20} threads=- dout=%Output% loader=Text{col=Label:U4[0-4]:0 col=Features:1-4} data=%Data% seed=1 /iter:10;/lr:0.2;/nl:20;/mil:10;/nt:1
