This page was generated from notebooks/explainer_examples.ipynb.

Example Model Explanations with Seldon

Seldon Core supports several out-of-the-box explainers that leverage the open-source Alibi ML explainability library.

In this notebook we show how you can use the pre-packaged explainer functionality that simplifies the creation of advanced AI model explainers.

Seldon provides the following out-of-the-box pre-packaged explainers:

  • Anchor Tabular Explainer
    • An AI explainer that uses the anchor technique for tabular data
    • It answers the question: which are the most "powerful" or "important" features in a tabular prediction?
  • Anchor Image Explainer
    • An AI explainer that uses the anchor technique for image data
    • It answers the question: which are the most "powerful" or "important" pixels in an image prediction?
  • Anchor Text Explainer
    • An AI explainer that uses the anchor technique for text data
    • It answers the question: which are the most "powerful" or "important" tokens in a text prediction?
  • Counterfactual Explainer
    • An AI explainer that uses the counterfactual technique for any type of data
    • It provides insight into the minimum changes you can make to an input in order to change the prediction to a different class
  • Contrastive Explainer
    • An AI explainer that uses the Contrastive Explanations technique for any type of data
    • It identifies the minimum changes to an input that would change the prediction, or the minimum components of the input needed to keep the prediction the same
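To make the anchor idea concrete, here is a minimal sketch of reading an anchor-style explanation. The field names follow Alibi's output format, but the predicates and numbers below are invented for illustration:

```python
# A hypothetical anchor explanation, shaped like the payloads the
# pre-packaged explainers return (field names follow Alibi's output;
# the predicates and numbers are made up for illustration).
explanation = {
    "names": ["Marital Status = Never-Married", "Occupation = Admin"],
    "precision": 0.97,
    "coverage": 0.10,
}

# The anchor is the minimal set of feature predicates that, when they hold,
# "lock in" the model's prediction with high precision.
rule = " AND ".join(explanation["names"])
print(f"IF {rule} THEN the prediction holds with "
      f"precision {explanation['precision']:.0%} "
      f"(coverage {explanation['coverage']:.0%})")
```

Precision is the probability the prediction stays the same when the anchor holds; coverage is the fraction of inputs the anchor applies to.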

Running this notebook

For the ImageNet model you will need the following package dependencies; if they are not already present, install them with:

  • Pillow (pip install Pillow)
  • matplotlib (pip install matplotlib)
  • tensorflow (pip install tensorflow)

You will also need to start Jupyter with settings to allow for large payloads, for example:

jupyter notebook --NotebookApp.iopub_data_rate_limit=1000000000

Setup Seldon Core

Follow the instructions to Setup Cluster with Ambassador Ingress and Install Seldon Core.

Then port-forward to that ingress on localhost:8003 in a separate terminal either with:

  • Ambassador: kubectl port-forward $(kubectl get pods -n seldon -l app.kubernetes.io/name=ambassador -o jsonpath='{.items[0].metadata.name}') -n seldon 8003:8080

  • Istio: kubectl port-forward $(kubectl get pods -l istio=ingressgateway -n istio-system -o jsonpath='{.items[0].metadata.name}') -n istio-system 8003:80

Create Namespace for experimentation

We will first create the seldon namespace, where we will deploy all our models.

[ ]:
!kubectl create namespace seldon

And then we will set the current context to use the seldon namespace, so that all our commands run there by default (instead of in the default namespace).

[ ]:
!kubectl config set-context $(kubectl config current-context) --namespace=seldon

Income Prediction Model

[ ]:
%%writefile resources/income_explainer.yaml
apiVersion: machinelearning.seldon.io/v1
kind: SeldonDeployment
metadata:
  name: income
spec:
  name: income
  annotations:
    seldon.io/rest-timeout: "100000"
  predictors:
  - graph:
      children: []
      implementation: SKLEARN_SERVER
      modelUri: gs://seldon-models/sklearn/income/model
      name: classifier
    explainer:
      type: AnchorTabular
      modelUri: gs://seldon-models/sklearn/income/explainer
    name: default
    replicas: 1
[ ]:
!kubectl apply -f resources/income_explainer.yaml
[ ]:
!kubectl rollout status deploy/$(kubectl get deploy -l seldon-deployment-id=income -o jsonpath='{.items[0].metadata.name}')
[ ]:
!kubectl rollout status deploy/income-default-explainer
[ ]:
from seldon_core.seldon_client import SeldonClient
import numpy as np
sc = SeldonClient(deployment_name="income",namespace="seldon", gateway="ambassador", gateway_endpoint="localhost:8003")

Use python client library to get a prediction.

[ ]:
data = np.array([[39, 7, 1, 1, 1, 1, 4, 1, 2174, 0, 40, 9]])
r = sc.predict(data=data)
print(r.response)
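The raw ndarray is hard to read on its own. As a sketch, the request row can be paired with Adult-dataset-style feature names, which this income model is commonly trained on; the names below are assumptions for illustration, not read from the deployment:

```python
# Assumed feature names for the Adult/income dataset; the deployed model's
# actual feature ordering may differ.
feature_names = ["Age", "Workclass", "Education", "Marital Status",
                 "Occupation", "Relationship", "Race", "Sex",
                 "Capital Gain", "Capital Loss", "Hours per week", "Country"]
row = [39, 7, 1, 1, 1, 1, 4, 1, 2174, 0, 40, 9]

# Print the request row as name/value pairs.
for name, value in zip(feature_names, row):
    print(f"{name}: {value}")
```

Read this way, the anchor predicates returned by the explainer (e.g. involving Marital Status or Capital Gain) map directly back to columns of the request.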

Use curl to get a prediction.

[ ]:
!curl -d '{"data": {"ndarray":[[39, 7, 1, 1, 1, 1, 4, 1, 2174, 0, 40, 9]]}}' \
   -X POST http://localhost:8003/seldon/seldon/income/api/v1.0/predictions \
   -H "Content-Type: application/json"

Use python client library to get an explanation.

[ ]:
data = np.array([[39, 7, 1, 1, 1, 1, 4, 1, 2174, 0, 40, 9]])
explanation = sc.explain(deployment_name="income", predictor="default", data=data)
print(explanation.response["names"])

Use curl to get an explanation.

[ ]:
!curl -X POST -H 'Content-Type: application/json' \
    -d '{"data": {"names": ["text"], "ndarray": [[52,  4,  0,  2,  8,  4,  2,  0,  0,  0, 60, 9]]}}' \
    http://localhost:8003/seldon/seldon/income-explainer/default/api/v1.0/explain | jq ".names"
[ ]:
!kubectl delete -f resources/income_explainer.yaml

Movie Sentiment Model

[ ]:
%%writefile resources/moviesentiment_explainer.yaml
apiVersion: machinelearning.seldon.io/v1
kind: SeldonDeployment
metadata:
  name: movie
spec:
  name: movie
  annotations:
    seldon.io/rest-timeout: "100000"
  predictors:
  - graph:
      children: []
      implementation: SKLEARN_SERVER
      modelUri: gs://seldon-models/sklearn/moviesentiment
      name: classifier
    explainer:
      type: AnchorText
    name: default
    replicas: 1
[ ]:
!kubectl apply -f resources/moviesentiment_explainer.yaml
[ ]:
!kubectl rollout status deploy/$(kubectl get deploy -l seldon-deployment-id=movie -o jsonpath='{.items[0].metadata.name}')
[ ]:
!kubectl rollout status deploy/movie-default-explainer
[ ]:
from seldon_core.seldon_client import SeldonClient
import numpy as np
sc = SeldonClient(deployment_name="movie", namespace="seldon", gateway_endpoint="localhost:8003", payload_type='ndarray')
[ ]:
!curl -d '{"data": {"ndarray":["This film has great actors"]}}' \
   -X POST http://localhost:8003/seldon/seldon/movie/api/v1.0/predictions \
   -H "Content-Type: application/json"
[ ]:
data = np.array(['this film has great actors'])
r = sc.predict(data=data)
print(r)
assert(r.success==True)
[ ]:
!curl -s -d '{"data": {"ndarray":["This movie has great actors"]}}' \
   -X POST http://localhost:8003/seldon/seldon/movie-explainer/default/api/v1.0/explain \
   -H "Content-Type: application/json" | jq ".names"
[ ]:
data = np.array(['this film has great actors'])
explanation = sc.explain(predictor="default", data=data)
print(explanation.response["names"])
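One simple way to read a text anchor is to highlight the anchor tokens inside the input sentence. A minimal sketch, assuming (hypothetically) that the explainer returned ['great'] as the anchor:

```python
# Uppercase the tokens that the AnchorText explainer identified as
# locking in the prediction. The anchor set below is a hypothetical
# result for illustration.
sentence = "this film has great actors"
anchor_tokens = {"great"}  # e.g. set(explanation.response["names"])

highlighted = " ".join(
    tok.upper() if tok in anchor_tokens else tok
    for tok in sentence.split()
)
print(highlighted)  # this film has GREAT actors
```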
[ ]:
!kubectl delete -f resources/moviesentiment_explainer.yaml

ImageNet Model

[ ]:
%%writefile resources/imagenet_explainer_grpc.yaml
apiVersion: machinelearning.seldon.io/v1
kind: SeldonDeployment
metadata:
  name: image
spec:
  annotations:
    seldon.io/rest-timeout: "10000000"
    seldon.io/grpc-timeout: "10000000"
    seldon.io/grpc-max-message-size: "1000000000"
  name: image
  predictors:
  - componentSpecs:
    - spec:
        containers:
        - image: docker.io/seldonio/imagenet-transformer:0.1
          name: transformer
    graph:
      name: transformer
      type: TRANSFORMER
      endpoint:
        type: GRPC
      children:
      - implementation: TENSORFLOW_SERVER
        modelUri: gs://seldon-models/tfserving/imagenet/model
        name: classifier
        endpoint:
          type: GRPC
        parameters:
          - name: model_name
            type: STRING
            value: classifier
          - name: model_input
            type: STRING
            value: input_image
          - name: model_output
            type: STRING
            value: predictions/Softmax:0
    svcOrchSpec:
      resources:
        requests:
          memory: 10Gi
        limits:
          memory: 10Gi
      env:
      - name: SELDON_LOG_LEVEL
        value: DEBUG
    explainer:
      type: AnchorImages
      modelUri: gs://seldon-models/tfserving/imagenet/explainer
      config:
        batch_size: "100"
      endpoint:
        type: GRPC
    name: default
    replicas: 1
[ ]:
!kubectl apply -f resources/imagenet_explainer_grpc.yaml
[ ]:
!kubectl rollout status deploy/$(kubectl get deploy -l seldon-deployment-id=image -o jsonpath='{.items[0].metadata.name}')
[ ]:
!kubectl rollout status deploy/image-default-explainer
[ ]:
from PIL import Image
%matplotlib inline
import matplotlib.pyplot as plt
from tensorflow.keras.applications.inception_v3 import decode_predictions
import numpy as np

def get_image_data():
    data = []
    image_shape = (299, 299, 3)
    target_size = image_shape[:2]
    image = Image.open("cat-raw.jpg").convert('RGB')
    image = np.expand_dims(image.resize(target_size), axis=0)
    data.append(image)
    data = np.concatenate(data, axis=0)
    return data

data = get_image_data()
[ ]:
from seldon_core.seldon_client import SeldonClient
import numpy as np
sc = SeldonClient(
    deployment_name="image",
    namespace="seldon",
    grpc_max_send_message_length= 27 * 1024 * 1024,
    grpc_max_receive_message_length= 27 * 1024 * 1024,
    gateway="ambassador",
    transport="grpc",
    gateway_endpoint="localhost:8003",
    client_return_type='proto')
[ ]:
import tensorflow as tf
data = get_image_data()
req = data[0:1]
r = sc.predict(data=req, payload_type='tftensor')

preds = tf.make_ndarray(r.response.data.tftensor)

label = decode_predictions(preds, top=1)
plt.title(label[0])
plt.imshow(data[0])
[ ]:
req = np.expand_dims(data[0], axis=0)
r = sc.explain(data=req, predictor="default", transport="rest", payload_type='ndarray', client_return_type="dict")
exp_arr = np.array(r.response['anchor'])

f, axarr = plt.subplots(1, 2)
axarr[0].imshow(data[0])
axarr[1].imshow(exp_arr)
plt.show()

[ ]:
!kubectl delete -f resources/imagenet_explainer_grpc.yaml

Tensorflow CIFAR10 Model

[ ]:
%%writefile resources/cifar10_explainer.yaml
apiVersion: machinelearning.seldon.io/v1
kind: SeldonDeployment
metadata:
  name: cifar10-classifier
spec:
  protocol: tensorflow
  annotations:
    seldon.io/rest-timeout: "100000"
  predictors:
  - componentSpecs:
    graph:
      implementation: TENSORFLOW_SERVER
      modelUri: gs://seldon-models/tfserving/cifar10/resnet32
      name: cifar10-classifier
      logger:
         mode: all
    explainer:
      type: AnchorImages
      modelUri: gs://seldon-models/tfserving/cifar10/explainer
    name: default
    replicas: 1
[ ]:
!kubectl apply -f resources/cifar10_explainer.yaml
[ ]:
!kubectl rollout status deploy/$(kubectl get deploy -l seldon-deployment-id=cifar10-classifier -o jsonpath='{.items[0].metadata.name}')
[ ]:
!kubectl rollout status deploy/cifar10-classifier-default-explainer
[ ]:
import tensorflow as tf
import matplotlib.pyplot as plt
import os

url = 'https://storage.googleapis.com/seldon-models/alibi-detect/classifier/'
path_model = os.path.join(url, "cifar10", "resnet32", 'model.h5')
save_path = tf.keras.utils.get_file("resnet32", path_model)
model = tf.keras.models.load_model(save_path)

train, test = tf.keras.datasets.cifar10.load_data()
X_train, y_train = train
X_test, y_test = test

X_train = X_train.astype('float32') / 255
X_test = X_test.astype('float32') / 255
print(X_train.shape, y_train.shape, X_test.shape, y_test.shape)
class_names = ['airplane', 'automobile', 'bird', 'cat', 'deer',
               'dog', 'frog', 'horse', 'ship', 'truck']
[ ]:
from subprocess import run, Popen, PIPE
import json
import numpy as np
idx = 12
test_example = X_test[idx:idx+1].tolist()
payload = json.dumps({"instances": test_example})
cmd = f"""curl -d '{payload}' \
   http://localhost:8003/seldon/seldon/cifar10-classifier/v1/models/cifar10-classifier:predict \
   -H "Content-Type: application/json"
"""
ret = Popen(cmd, shell=True,stdout=PIPE)
raw = ret.stdout.read().decode("utf-8")
print(raw)
res=json.loads(raw)
arr=np.array(res["predictions"])
X = X_test[idx].reshape(1, 32, 32, 3)
plt.imshow(X.reshape(32, 32, 3))
plt.axis('off')
plt.show()
print("class:",class_names[y_test[idx][0]])
print("prediction:",class_names[arr[0].argmax()])
[ ]:
test_example = X_test[idx:idx+1].tolist()
payload = json.dumps({"instances": test_example})
cmd = f"""curl -d '{payload}' \
   http://localhost:8003/seldon/seldon/cifar10-classifier-explainer/default/v1/models/cifar10-classifier:explain \
   -H "Content-Type: application/json"
"""
ret = Popen(cmd, shell=True,stdout=PIPE)
raw = ret.stdout.read().decode("utf-8")
explanation = json.loads(raw)
arr = np.array(explanation["anchor"])
plt.imshow(arr)
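The returned "anchor" array is an image in which only the superpixels supporting the prediction are kept. A minimal sketch of quantifying how much of the image the anchor retains, using a synthetic anchor array in place of a live explanation:

```python
import numpy as np

# Synthetic 32x32 anchor image for illustration: only a central region
# is kept (non-zero), as if those superpixels formed the anchor.
anchor = np.zeros((32, 32, 3))
anchor[8:24, 8:24, :] = 1.0  # pretend this region was the anchor

# Fraction of pixels the anchor retains (non-black pixels).
kept = (anchor.sum(axis=-1) > 0).mean()
print(f"anchor covers {kept:.0%} of the image")  # anchor covers 25% of the image
```

A small, high-precision anchor region is usually the most informative: it means a few superpixels alone are enough to fix the model's prediction.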