Commit 2597569

Updated the landing page.
1 parent c1e610f commit 2597569

File tree

1 file changed: +14 -1 lines changed


index.html

Lines changed: 14 additions & 1 deletion
@@ -10,7 +10,20 @@
 </head>
 <body>
   <div class="container-fluid">
-    <h1>Inference of Scikit-learn model in C++</h1>
+    <h1>Inference of scikit-learn models in C++</h1>
+
+    <h2>Available options</h2>
+    <h3>Model persistence and inference libraries</h3>
+    <p>
+      According to the scikit-learn documentation, one can save a trained scikit-learn model to disk in either ONNX or PMML format. The saved model can then easily be loaded back into Python for inference. But how can one use such a saved model in a C++ application? In the case of ONNX, one can use ONNX Runtime. Loading models saved in PMML format is well supported in Java, but there are no good C++ libraries that can load a PMML model.
+    </p>
+    <h3>Converting models into machine code</h3>
+    <h3>Using libraries that scikit-learn uses under the hood</h3>
+    <h3>Embedding a Python interpreter in C++</h3>
+    <h2>ONNX and ONNX Runtime</h2>
+    <h3>Why ONNX?</h3>
+    <h2>Toy application</h2>
+    <p>I created a toy application to show how one can use ONNX Runtime to do scikit-learn model inference in a C/C++ application.</p>
     <img src="src/assets/screenshot.gif" alt="">
   </div>
   <!-- JavaScript Bundle with Popper -->
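
The paragraph added above points to ONNX Runtime as the way to run an ONNX-exported scikit-learn model from C++. As a rough sketch of that workflow (not the repository's toy application itself), the snippet below uses the ONNX Runtime C++ API to load a saved model and predict a label for a single sample. The model path "model.onnx", the 4-feature input shape, and the tensor names "float_input" and "output_label" are assumptions; the names are the usual skl2onnx defaults for a classifier and must match however the model was actually exported.

// Minimal sketch: run a scikit-learn classifier exported to ONNX from C++.
// Assumes skl2onnx default tensor names and a 4-feature float input.
#include <onnxruntime_cxx_api.h>

#include <array>
#include <iostream>

int main() {
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "sklearn-onnx-demo");
    Ort::SessionOptions options;
    Ort::Session session(env, "model.onnx", options);  // assumed model path

    // One sample with 4 features (e.g. iris measurements).
    std::array<float, 4> features{5.1f, 3.5f, 1.4f, 0.2f};
    std::array<int64_t, 2> shape{1, 4};

    Ort::MemoryInfo memory_info =
        Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
    Ort::Value input = Ort::Value::CreateTensor<float>(
        memory_info, features.data(), features.size(),
        shape.data(), shape.size());

    const char* input_names[] = {"float_input"};    // assumed (skl2onnx default)
    const char* output_names[] = {"output_label"};  // assumed (skl2onnx default)

    auto outputs = session.Run(Ort::RunOptions{nullptr},
                               input_names, &input, 1,
                               output_names, 1);

    // For a classifier the label output is an int64 tensor.
    int64_t* label = outputs.front().GetTensorMutableData<int64_t>();
    std::cout << "predicted label: " << label[0] << "\n";
    return 0;
}

Building it only needs the ONNX Runtime headers on the include path and a link against the onnxruntime library (e.g. g++ demo.cpp -lonnxruntime).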
