</head>
<body>
<div class="container-fluid">
<h1>Inference of scikit-learn models in C++</h1>
<h2>Available options</h2>
<h3>Model persistence and inference libraries</h3>
<p>
According to the scikit-learn documentation, one can save a trained scikit-learn model to disk in either ONNX or PMML format. The saved model can then easily be loaded back into Python for inference. But how can one use such a saved model in a C++ application? In the case of ONNX, one can use ONNX Runtime. Loading models saved in PMML format, on the other hand, is well supported in Java, but there are no good C++ libraries for loading a PMML model.
</p>
<h3>Converting models into machine code</h3>
<h3>Using libraries that scikit-learn uses under the hood</h3>
<h3>Embedding a python interpreter in C++</h3>
<h2>ONNX and ONNX runtime</h2>
<h3>Why ONNX?</h3>
<h2>Toy application</h2>
<p>I created a toy application to show how one can use ONNX Runtime to do scikit-learn model inference in a C/C++ application.</p>