Add kserve-deeploy-shapkernel runtime and example
Signed-off-by: Tim Kleinloog <[email protected]>
TimKleinloog committed May 6, 2024
1 parent 59d705b commit bc06686
Showing 9 changed files with 194 additions and 0 deletions.
34 changes: 34 additions & 0 deletions charts/kserve-resources/templates/clusterservingruntimes.yaml
@@ -373,3 +373,37 @@ spec:
        limits:
          cpu: "1"
          memory: 2Gi
---
apiVersion: serving.kserve.io/v1alpha1
kind: ClusterServingRuntime
metadata:
  name: kserve-deeploy-shapkernelserver
spec:
  annotations:
    prometheus.kserve.io/path: /metrics
    prometheus.kserve.io/port: "8080"
  supportedModelFormats:
    - autoSelect: true
      name: shap-kernel
      priority: 1
      version: "1"
  protocolVersions:
    - v1
    - v2
  containers:
    - name: kserve-container
      image: "{{ index .Values.kserve.servingruntime "deeploy-shapkernelserver" "image" }}:{{ index .Values.kserve.servingruntime "deeploy-shapkernelserver" "tag" }}"
      args:
        - --model_name={{.Name}}
        - --predictor_host={{.Name}}-predictor.{{.Namespace}}
        - --model_dir=/mnt/models
        - --http_port=8080
        - --nthread=1
        - --explainer_sub_type=SHAPKernel
      resources:
        limits:
          cpu: "1"
          memory: 2Gi
        requests:
          cpu: "1"
          memory: 2Gi
3 changes: 3 additions & 0 deletions charts/kserve-resources/values.yaml
@@ -125,3 +125,6 @@ kserve:
    art:
      image: kserve/art-explainer
      defaultVersion: *defaultVersion
    deeploy-shapkernelserver:
      image: deeployml/deeploy-shap
      tag: v0.1.0
33 changes: 33 additions & 0 deletions config/runtimes/kserve-deeploy-shapkernelserver.yaml
@@ -0,0 +1,33 @@
apiVersion: serving.kserve.io/v1alpha1
kind: ClusterServingRuntime
metadata:
  name: kserve-deeploy-shapkernelserver
spec:
  annotations:
    prometheus.kserve.io/path: /metrics
    prometheus.kserve.io/port: "8080"
  supportedModelFormats:
    - autoSelect: true
      name: shap-kernel
      priority: 1
      version: "1"
  protocolVersions:
    - v1
    - v2
  containers:
    - name: kserve-container
      image: kserve-deeploy-shapkernelserver:replace
      args:
        - --model_name={{.Name}}
        - --predictor_host={{.Name}}-predictor.{{.Namespace}}
        - --model_dir=/mnt/models
        - --http_port=8080
        - --nthread=1
        - --explainer_sub_type=SHAPKernel
      resources:
        limits:
          cpu: "1"
          memory: 2Gi
        requests:
          cpu: "1"
          memory: 2Gi
5 changes: 5 additions & 0 deletions config/runtimes/kustomization.yaml
@@ -12,6 +12,7 @@ resources:
- kserve-lgbserver.yaml
- kserve-torchserve.yaml
- kserve-huggingfaceserver.yaml
- kserve-deeploy-shapkernelserver.yaml

images:
# SMS Only Runtimes
@@ -54,3 +55,7 @@ images:
- name: huggingfaceserver
  newName: kserve/huggingfaceserver
  newTag: latest

- name: kserve-deeploy-shapkernelserver
  newName: deeployml/deeploy-shap
  newTag: v0.1.0
64 changes: 64 additions & 0 deletions docs/samples/v1beta1/explanation/deeploy/shap-kernel/README.md
@@ -0,0 +1,64 @@
## Creating your own model and explainer to test on the Scikit-learn and Deeploy SHAP Kernel servers

First, we need to generate a simple Scikit-learn model and SHAP kernel explainer using Python:

```shell
python train.py
```
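
Before uploading anything, you can optionally sanity-check the exported artifacts locally. This is a minimal sketch, assuming `train.py` has just written `model.joblib` and `explainer.dill` to the current directory:

```python
import dill
import joblib
import numpy as np

# Load the artifacts written by train.py (local paths are assumptions for this sketch)
model = joblib.load("./model.joblib")
with open("./explainer.dill", "rb") as f:
    explainer = dill.load(f)

# One row with the same 10 features used in sample-input.json
instance = np.array([[51.0, 0.0, 13.0, 2.0, 4.0, 1.0, 12250.0, 500.0, 40.0, 21.0]])

print("prediction:", model.predict(instance))
print("shap values:", explainer.shap_values(instance))
print("expected value:", explainer.expected_value)
```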

For this example, the resulting artifacts are already available in Google Cloud Storage, so you can use them directly without running the training step yourself.

## Predict and explain with an InferenceService using the Scikit-learn and Deeploy SHAP Kernel servers

## Setup
1. Your ~/.kube/config should point to a cluster with [KServe installed](https://github.com/kserve/kserve#installation).
2. Your cluster's Istio Ingress gateway must be [network accessible](https://istio.io/latest/docs/tasks/traffic-management/ingress/ingress-control/).
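
A quick way to check the first prerequisite is to confirm that the KServe CRDs and controller are present (assuming the default `kserve` namespace):

```shell
kubectl get crd inferenceservices.serving.kserve.io
kubectl get pods -n kserve
```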

## Create the InferenceService

Apply the InferenceService custom resource:
```
kubectl apply -f sample.yaml
```

Expected Output
```
$ inferenceservice.serving.kserve.io/deeploy-sample created
```
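
Before sending requests, it can help to wait until the InferenceService reports `Ready`; one way to do that (the exact output columns vary by KServe version):

```shell
kubectl get inferenceservice deeploy-sample
# block until the predictor and explainer are ready
kubectl wait --for=condition=Ready inferenceservice/deeploy-sample --timeout=300s
```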

## Run a prediction and explanation

The first step is to [determine the ingress IP and ports](../../../../README.md#determine-the-ingress-ip-and-ports) and set `INGRESS_HOST` and `INGRESS_PORT`.
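
For a cluster where the Istio ingress gateway is exposed through a LoadBalancer service, that typically looks like the following (adjust for NodePort or port-forwarding setups as described in the linked README):

```shell
export INGRESS_HOST=$(kubectl -n istio-system get service istio-ingressgateway -o jsonpath='{.status.loadBalancer.ingress[0].ip}')
export INGRESS_PORT=$(kubectl -n istio-system get service istio-ingressgateway -o jsonpath='{.spec.ports[?(@.name=="http2")].port}')
```

With those set, request a prediction and explanation from the explainer endpoint: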

```
MODEL_NAME=deeploy-sample
INPUT_PATH=@./sample-input.json
SERVICE_HOSTNAME=$(kubectl get inferenceservice deeploy-sample -o jsonpath='{.status.url}' | cut -d "/" -f 3)
curl -v -H "Host: ${SERVICE_HOSTNAME}" http://${INGRESS_HOST}:${INGRESS_PORT}/v1/models/$MODEL_NAME:explain -d $INPUT_PATH
```

Expected Output

```
{
  "predictions": [
    true
  ],
  "explanations": [
    {
      "shap_values": [
        0.01666666666666666,
        0,
        0.10000000000000013,
        -0.15000000000000008,
        0.03333333333333335,
        -0.4166666666666668,
        0,
        0,
        0.01666666666666672,
        0
      ],
      "expected_value": 0.4
    }
  ]
}
```
5 changes: 5 additions & 0 deletions docs/samples/v1beta1/explanation/deeploy/shap-kernel/requirements.txt
@@ -0,0 +1,5 @@
dill==0.3.6
joblib==1.3.2
scikit-learn==1.3.0
shap==0.42.0

3 changes: 3 additions & 0 deletions docs/samples/v1beta1/explanation/deeploy/shap-kernel/sample-input.json
@@ -0,0 +1,3 @@
{
  "instances": [[51.0, 0.0, 13.0, 2.0, 4.0, 1.0, 12250.0, 500.0, 40.0, 21.0]]
}
21 changes: 21 additions & 0 deletions docs/samples/v1beta1/explanation/deeploy/shap-kernel/sample.yaml
@@ -0,0 +1,21 @@
apiVersion: "serving.kserve.io/v1beta1"
kind: "InferenceService"
metadata:
name: "deeploy-sample"
spec:
predictor:
model:
modelFormat:
name: sklearn
version:
runtime: kserve-sklearnserver
protocolVersion: "v2"
storageUri: "gs://deeploy-examples/sklearn/census/20231128130326/model"
explainer:
model:
modelFormat:
name: shap-kernel
version:
runtime: kserve-deeploy-shapkernelserver
protocolVersion: "v2"
storageUri: "gs://deeploy-examples/sklearn/census/20231128130326/explainer"
26 changes: 26 additions & 0 deletions docs/samples/v1beta1/explanation/deeploy/shap-kernel/train.py
@@ -0,0 +1,26 @@
import dill
import joblib
import shap
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Load data
X, y = shap.datasets.adult()

# Filter columns, removing sensitive features
sensitive_features = ["Sex", "Race"]
X = X.drop(columns=sensitive_features)

X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, random_state=7
)

# Train knn
knn = KNeighborsClassifier()
knn.fit(X_train, y_train)

# Fit SHAP Kernel explainer on the probability of the positive class,
# using the training-set medians as the background data
f = lambda x: knn.predict_proba(x)[:, 1]
med = X_train.median().values.reshape((1, X_train.shape[1]))
explainer = shap.KernelExplainer(f, med)

# Export model and explainer artifacts
joblib.dump(value=knn, filename='./model.joblib')
with open('./explainer.dill', 'wb') as explainer_file:
    dill.dump(explainer, explainer_file)
