This page was generated from notebooks/server_examples.ipynb.
Example Model Servers with Seldon¶
Follow the docs to install Seldon Core.
[1]:
!kubectl create namespace seldon || echo "Already exists"
Error from server (AlreadyExists): namespaces "seldon" already exists
Already exists
[2]:
!kubectl config set-context $(kubectl config current-context) --namespace=seldon
Context "kind-kind" modified.
[3]:
import json
Serve SKLearn Iris Model¶
To deploy SKLearn artifacts, we can use the pre-packaged SKLearn inference server. The exposed API can follow either:
The default Seldon protocol.
The V2 protocol.
Default Seldon protocol¶
To deploy and start serving an SKLearn artifact using Seldon’s default protocol, we can use a config like the one below:
[4]:
%%writefile ../servers/sklearnserver/samples/iris.yaml
apiVersion: machinelearning.seldon.io/v1alpha2
kind: SeldonDeployment
metadata:
name: sklearn
spec:
predictors:
- graph:
name: classifier
implementation: SKLEARN_SERVER
modelUri: gs://seldon-models/v1.19.0-dev/sklearn/iris
name: default
replicas: 1
svcOrchSpec:
env:
- name: SELDON_LOG_LEVEL
value: DEBUG
Overwriting ../servers/sklearnserver/samples/iris.yaml
We can then apply it to deploy it to our Kubernetes cluster.
[5]:
!kubectl apply -f ../servers/sklearnserver/samples/iris.yaml
seldondeployment.machinelearning.seldon.io/sklearn created
[6]:
!kubectl rollout status deploy/$(kubectl get deploy -l seldon-deployment-id=sklearn -o jsonpath='{.items[0].metadata.name}')
Waiting for deployment "sklearn-default-0-classifier" rollout to finish: 0 of 1 updated replicas are available...
deployment "sklearn-default-0-classifier" successfully rolled out
Once it’s deployed, we can send requests to our SKLearn model.
REST Requests¶
[7]:
X=!curl -s -d '{"data": {"ndarray":[[1.0, 2.0, 5.0, 6.0]]}}' \
-X POST http://localhost:8004/seldon/seldon/sklearn/api/v1.0/predictions \
-H "Content-Type: application/json"
d=json.loads(X[0])
print(d)
{'data': {'names': ['t:0', 't:1', 't:2'], 'ndarray': [[9.912315378486718e-07, 0.0007015931307743852, 0.9992974156376878]]}, 'meta': {'requestPath': {'classifier': 'seldonio/sklearnserver:1.18.0'}}}
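The same call can be made from plain Python instead of curl. The sketch below builds the default Seldon-protocol request body and extracts the top class from a response like the one shown above; `seldon_request` and `top_class` are hypothetical helper names, not part of `seldon_core`:

```python
import numpy as np

def seldon_request(ndarray):
    """Build a default Seldon-protocol request body for an ndarray payload."""
    return {"data": {"ndarray": np.asarray(ndarray).tolist()}}

def top_class(response):
    """Return the highest-scoring class index from a Seldon-protocol response."""
    scores = response["data"]["ndarray"][0]
    return int(np.argmax(scores))

# Example against a response shaped like the one above:
resp = {
    "data": {
        "names": ["t:0", "t:1", "t:2"],
        "ndarray": [[9.91e-07, 7.02e-04, 0.9993]],
    },
    "meta": {"requestPath": {"classifier": "seldonio/sklearnserver:1.18.0"}},
}
print(top_class(resp))  # 2
```

To actually send it, the body can be posted to the same endpoint as the curl command, e.g. `requests.post("http://localhost:8004/seldon/seldon/sklearn/api/v1.0/predictions", json=seldon_request([[1.0, 2.0, 5.0, 6.0]]))`.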
[8]:
from seldon_core.seldon_client import SeldonClient
sc = SeldonClient(deployment_name="sklearn", namespace="seldon")
[9]:
r = sc.predict(gateway="istio", transport="rest", shape=(1, 4))
print(r)
assert r.success
Success:True message:
Request:
meta {
}
data {
tensor {
shape: 1
shape: 4
values: 0.17141173176497349
values: 0.5937726115406986
values: 0.3305595820782128
values: 0.2046623631664879
}
}
Response:
{'data': {'names': ['t:0', 't:1', 't:2'], 'tensor': {'shape': [1, 3], 'values': [0.41804496555456905, 0.3930727629239069, 0.1888822715215241]}}, 'meta': {'requestPath': {'classifier': 'seldonio/sklearnserver:1.18.0'}}}
gRPC Requests¶
[10]:
r = sc.predict(gateway="istio", transport="grpc", shape=(1, 4))
print(r)
assert r.success
Success:True message:
Request:
{'meta': {}, 'data': {'tensor': {'shape': [1, 4], 'values': [0.39156951217983293, 0.9277883706101252, 0.965137780400803, 0.7889296899083068]}}}
Response:
{'meta': {'requestPath': {'classifier': 'seldonio/sklearnserver:1.18.0'}}, 'data': {'names': ['t:0', 't:1', 't:2'], 'tensor': {'shape': [1, 3], 'values': [0.17859883395119414, 0.2418372547184021, 0.5795639113304039]}}}
[11]:
X=!cd ../executor/proto && grpcurl -d '{"data":{"ndarray":[[1.0,2.0,5.0,6.0]]}}' \
-rpc-header seldon:sklearn -rpc-header namespace:seldon \
-plaintext \
-proto ./prediction.proto 0.0.0.0:8004 seldon.protos.Seldon/Predict
d=json.loads("".join(X))
print(d)
{'meta': {'requestPath': {'classifier': 'seldonio/sklearnserver:1.18.0'}}, 'data': {'names': ['t:0', 't:1', 't:2'], 'ndarray': [[9.912315378486718e-07, 0.0007015931307743852, 0.9992974156376878]]}}
Finally, we can delete the model we deployed.
[12]:
!kubectl delete -f ../servers/sklearnserver/samples/iris.yaml
seldondeployment.machinelearning.seldon.io "sklearn" deleted
V2 protocol¶
We can deploy an SKLearn model exposing an API compatible with the V2 protocol by setting the protocol of our SeldonDeployment to v2. For example, we can consider the config below:
[13]:
%%writefile ./resources/iris-sklearn-v2.yaml
apiVersion: machinelearning.seldon.io/v1
kind: SeldonDeployment
metadata:
name: sklearn
spec:
name: iris
protocol: v2
predictors:
- graph:
children: []
implementation: SKLEARN_SERVER
modelUri: gs://seldon-models/sklearn/iris-0.23.2/lr_model
name: classifier
name: default
replicas: 1
Overwriting ./resources/iris-sklearn-v2.yaml
We can then apply it to deploy our model to our Kubernetes cluster.
[14]:
!kubectl apply -f resources/iris-sklearn-v2.yaml
seldondeployment.machinelearning.seldon.io/sklearn created
[15]:
!kubectl rollout status deploy/$(kubectl get deploy -l seldon-deployment-id=sklearn -o jsonpath='{.items[0].metadata.name}')
deployment "sklearn-default-0-classifier" successfully rolled out
Once it’s deployed, we can send inference requests to our model. Note that, since it’s using the V2 protocol, these requests will be different from the ones using the default Seldon protocol.
[16]:
import json
import requests
inference_request = {
"inputs": [
{"name": "predict", "shape": [1, 4], "datatype": "FP32", "data": [[1, 2, 3, 4]]}
]
}
endpoint = "http://localhost:8004/seldon/seldon/sklearn/v2/models/infer"
response = requests.post(endpoint, json=inference_request)
print(json.dumps(response.json(), indent=2))
assert response.ok
{
"model_name": "classifier",
"model_version": "v1",
"id": "99c6d14e-4e8b-4458-b41c-6d2922c544a4",
"parameters": {},
"outputs": [
{
"name": "predict",
"shape": [
1,
1
],
"datatype": "INT64",
"parameters": {
"content_type": "np"
},
"data": [
2
]
}
]
}
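A V2 response nests predictions inside an `outputs` list, each entry carrying its own `shape` and `datatype`. A small sketch for pulling one output back out (`v2_output` is a name of my own, not a Seldon API):

```python
def v2_output(response, name=None):
    """Return (data, shape) for one output of a V2 inference response."""
    outputs = response["outputs"]
    out = outputs[0] if name is None else next(o for o in outputs if o["name"] == name)
    return out["data"], out["shape"]

# Example against a response shaped like the one above:
resp = {
    "model_name": "classifier",
    "outputs": [
        {"name": "predict", "shape": [1, 1], "datatype": "INT64", "data": [2]}
    ],
}
data, shape = v2_output(resp, "predict")
print(data, shape)  # [2] [1, 1]
```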
Finally, we can delete the model we deployed.
[17]:
!kubectl delete -f resources/iris-sklearn-v2.yaml
seldondeployment.machinelearning.seldon.io "sklearn" deleted
Serve XGBoost Iris Model¶
To deploy XGBoost models, we can use the pre-packaged XGBoost inference server. The exposed API can follow either:
The default Seldon protocol.
The V2 protocol.
Default Seldon protocol¶
We can deploy an XGBoost model uploaded to an object store by using the XGBoost model server implementation, as shown in the config below:
[18]:
%%writefile resources/iris.yaml
apiVersion: machinelearning.seldon.io/v1
kind: SeldonDeployment
metadata:
name: xgboost
spec:
name: iris
predictors:
- graph:
children: []
implementation: XGBOOST_SERVER
modelUri: gs://seldon-models/xgboost/iris
name: classifier
name: default
replicas: 1
Overwriting resources/iris.yaml
We can then apply it to deploy it to our Kubernetes cluster.
[19]:
!kubectl apply -f resources/iris.yaml
seldondeployment.machinelearning.seldon.io/xgboost created
[20]:
!kubectl rollout status deploy/$(kubectl get deploy -l seldon-deployment-id=xgboost -o jsonpath='{.items[0].metadata.name}')
Waiting for deployment "xgboost-default-0-classifier" rollout to finish: 0 of 1 updated replicas are available...
deployment "xgboost-default-0-classifier" successfully rolled out
REST Requests¶
[21]:
X=!curl -s -d '{"data": {"ndarray":[[1.0, 2.0, 5.0, 6.0]]}}' \
-X POST http://localhost:8004/seldon/seldon/xgboost/api/v1.0/predictions \
-H "Content-Type: application/json"
d=json.loads(X[0])
print(d)
{'data': {'names': [], 'ndarray': [2.0]}, 'meta': {'requestPath': {'classifier': 'seldonio/xgboostserver:1.18.0'}}}
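Note that, unlike the class probabilities returned by the SKLearn server earlier, the XGBoost server here returns the predicted class index directly (`2.0`). Mapping that index back to a label is up to the client; the label list below assumes the standard scikit-learn iris class ordering, which may not match how an arbitrary model was trained:

```python
# Hypothetical label mapping for the iris model; the server itself only
# returns the numeric class index.
IRIS_CLASSES = ["setosa", "versicolor", "virginica"]

def label_for(response):
    """Map a Seldon-protocol XGBoost response to a human-readable label."""
    idx = int(response["data"]["ndarray"][0])
    return IRIS_CLASSES[idx]

resp = {"data": {"names": [], "ndarray": [2.0]}}
print(label_for(resp))  # virginica
```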
[22]:
from seldon_core.seldon_client import SeldonClient
sc = SeldonClient(deployment_name="xgboost", namespace="seldon")
[23]:
r = sc.predict(gateway="istio", transport="rest", shape=(1, 4))
print(r)
assert r.success
Success:True message:
Request:
meta {
}
data {
tensor {
shape: 1
shape: 4
values: 0.5059863560669698
values: 0.6819976066125761
values: 0.8325416286359293
values: 0.2531200764967788
}
}
Response:
{'data': {'names': [], 'tensor': {'shape': [1], 'values': [0.0]}}, 'meta': {'requestPath': {'classifier': 'seldonio/xgboostserver:1.18.0'}}}
gRPC Requests¶
[24]:
r = sc.predict(gateway="istio", transport="grpc", shape=(1, 4))
print(r)
assert r.success
Success:True message:
Request:
{'meta': {}, 'data': {'tensor': {'shape': [1, 4], 'values': [0.3402484582438542, 0.7075322398328269, 0.2491942933114213, 0.08837167380879685]}}}
Response:
{'meta': {'requestPath': {'classifier': 'seldonio/xgboostserver:1.18.0'}}, 'data': {'tensor': {'shape': [1], 'values': [0.0]}}}
[25]:
X=!cd ../executor/proto && grpcurl -d '{"data":{"ndarray":[[1.0,2.0,5.0,6.0]]}}' \
-rpc-header seldon:xgboost -rpc-header namespace:seldon \
-plaintext \
-proto ./prediction.proto 0.0.0.0:8004 seldon.protos.Seldon/Predict
d=json.loads("".join(X))
print(d)
{'meta': {'requestPath': {'classifier': 'seldonio/xgboostserver:1.18.0'}}, 'data': {'ndarray': [2]}}
Finally, we can delete the model we deployed.
[26]:
!kubectl delete -f resources/iris.yaml
seldondeployment.machinelearning.seldon.io "xgboost" deleted
V2 protocol¶
We can deploy an XGBoost model exposing an API compatible with the V2 protocol by setting the protocol of our SeldonDeployment to v2. For example, we can consider the config below:
[27]:
%%writefile ./resources/iris-xgboost-v2.yaml
apiVersion: machinelearning.seldon.io/v1
kind: SeldonDeployment
metadata:
name: xgboost
spec:
name: iris
protocol: v2
predictors:
- graph:
children: []
implementation: XGBOOST_SERVER
modelUri: gs://seldon-models/xgboost/iris
name: iris
name: default
replicas: 1
Overwriting ./resources/iris-xgboost-v2.yaml
We can then apply it to deploy our model to our Kubernetes cluster.
[28]:
!kubectl apply -f ./resources/iris-xgboost-v2.yaml
seldondeployment.machinelearning.seldon.io/xgboost created
[29]:
!kubectl rollout status deploy/$(kubectl get deploy -l seldon-deployment-id=xgboost -o jsonpath='{.items[0].metadata.name}')
Waiting for deployment "xgboost-default-0-iris" rollout to finish: 0 of 1 updated replicas are available...
deployment "xgboost-default-0-iris" successfully rolled out
Once it’s deployed, we can send inference requests to our model. Note that, since it’s using the V2 protocol, these requests will be different from the ones using the default Seldon protocol.
[30]:
import json
import requests
inference_request = {
"inputs": [
{"name": "predict", "shape": [1, 4], "datatype": "FP32", "data": [[1, 2, 3, 4]]}
]
}
endpoint = "http://localhost:8004/seldon/seldon/xgboost/v2/models/infer"
response = requests.post(endpoint, json=inference_request)
print(json.dumps(response.json(), indent=2))
assert response.ok
{
"model_name": "iris",
"model_version": "v0.1.0",
"id": "3cab5b54-2ab7-4fe4-b9a2-7dd0f5346e2f",
"parameters": {},
"outputs": [
{
"name": "predict",
"shape": [
1,
1
],
"datatype": "FP32",
"parameters": {
"content_type": "np"
},
"data": [
2.0
]
}
]
}
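Hand-writing V2 request bodies is easy to get wrong (shape vs. data nesting, datatype strings). A sketch of a helper that derives both from a NumPy array; the dtype-to-datatype table is a partial mapping based on my reading of the V2 (Open Inference) protocol, and `v2_request` is a name of my own:

```python
import numpy as np

# Partial NumPy-dtype -> V2 datatype mapping; see the V2 protocol spec
# for the full table.
V2_DTYPES = {"float32": "FP32", "float64": "FP64", "int32": "INT32", "int64": "INT64"}

def v2_request(name, array):
    """Build a V2 inference request body from a NumPy array."""
    arr = np.asarray(array)
    return {
        "inputs": [
            {
                "name": name,
                "shape": list(arr.shape),
                "datatype": V2_DTYPES[str(arr.dtype)],
                "data": arr.tolist(),
            }
        ]
    }

req = v2_request("predict", np.array([[1, 2, 3, 4]], dtype=np.float32))
print(req["inputs"][0]["shape"], req["inputs"][0]["datatype"])  # [1, 4] FP32
```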
Finally, we can delete the model we deployed.
[31]:
!kubectl delete -f ./resources/iris-xgboost-v2.yaml
seldondeployment.machinelearning.seldon.io "xgboost" deleted
Serve Tensorflow MNIST Model¶
We can deploy a Tensorflow model uploaded to an object store by using the Tensorflow model server implementation, as shown in the config below.
This notebook contains two examples: one showing how you can use the TFServing pre-packaged server with the Seldon protocol, and a second showing how you can deploy it using the Tensorflow protocol (so you can send requests in the exact format you would send to a TFServing server).
Serve Tensorflow MNIST Model with Seldon Protocol¶
The config file below shows how you can deploy a Tensorflow model that exposes the Seldon protocol.
[32]:
%%writefile ./resources/mnist_rest.yaml
apiVersion: machinelearning.seldon.io/v1alpha2
kind: SeldonDeployment
metadata:
name: tfserving
spec:
name: mnist
predictors:
- graph:
children: []
implementation: TENSORFLOW_SERVER
modelUri: gs://seldon-models/tfserving/mnist-model
name: mnist-model
parameters:
- name: signature_name
type: STRING
value: predict_images
- name: model_name
type: STRING
value: mnist-model
- name: model_input
type: STRING
value: images
- name: model_output
type: STRING
value: scores
name: default
replicas: 1
Overwriting ./resources/mnist_rest.yaml
[33]:
!kubectl apply -f ./resources/mnist_rest.yaml
seldondeployment.machinelearning.seldon.io/tfserving created
[34]:
!kubectl rollout status deploy/$(kubectl get deploy -l seldon-deployment-id=tfserving -o jsonpath='{.items[0].metadata.name}')
Waiting for deployment "tfserving-default-0-mnist-model" rollout to finish: 0 of 1 updated replicas are available...
deployment "tfserving-default-0-mnist-model" successfully rolled out
[35]:
from seldon_core.seldon_client import SeldonClient
sc = SeldonClient(deployment_name="tfserving", namespace="seldon")
REST Request¶
[36]:
r = sc.predict(gateway="istio", transport="rest", shape=(1, 784))
print(r)
assert r.success
Success:True message:
Request:
meta {
}
data {
tensor {
shape: 1
shape: 784
values: 0.40042030856645994
... (remaining 783 random input values elided) ...
}
}
values: 0.25895170379992405
values: 0.8047859700242561
values: 0.6133231682938037
values: 0.8293408626573565
values: 0.23571573511526933
values: 0.2400163020691628
values: 0.4659787365933422
values: 0.5162713562648295
values: 0.01544941105499864
values: 0.6980509488657671
values: 0.15839831056540865
values: 0.17692555786943653
values: 0.19493960399955046
values: 0.6258275742192267
values: 0.9834136394321094
values: 0.937758511910898
values: 0.7758392315924554
values: 0.30401012096778
values: 0.8091374292060113
values: 0.9014075875503795
values: 0.5215968586017095
values: 0.9765602588360804
values: 0.49927384461803603
values: 0.6526239780264143
values: 0.4185468883801359
values: 0.38887382347259225
values: 0.30807500760458484
values: 0.005068956303728167
values: 0.443686331502256
values: 0.3379990233075232
values: 0.9685275757426719
values: 0.9042808932150844
values: 0.16226525177610174
values: 0.3692737836496336
values: 0.4416839847934151
values: 0.7651496536985714
values: 0.8213035714568577
values: 0.033262250669029325
}
}
Response:
{'data': {'names': ['t:0', 't:1', 't:2', 't:3', 't:4', 't:5', 't:6', 't:7', 't:8', 't:9'], 'tensor': {'shape': [1, 10], 'values': [7.59969411e-22, 5.43922634e-34, 0.414125204, 0.585804045, 3.04793821e-29, 7.07601721e-05, 1.3436987e-12, 1.51228816e-17, 2.97117952e-09, 3.69011927e-20]}}, 'meta': {'requestPath': {'mnist-model': 'seldonio/tfserving-proxy:1.18.0'}}}
gRPC Request¶
[37]:
r = sc.predict(gateway="istio", transport="grpc", shape=(1, 784))
print(r)
assert r.success == True
Success:True message:
Request:
{'meta': {}, 'data': {'tensor': {'shape': [1, 784], 'values': [0.36172822444251884, 0.4699208601130421, 0.04599955578582804, ... (784 random pixel values truncated for brevity)]}}}
Response:
{'data': {'tftensor': {'dtype': 'DT_FLOAT', 'tensorShape': {'dim': [{'size': '1'}, {'size': '10'}]}, 'floatVal': [1.0785601e-24, 6.920695e-34, 0.9999403, 5.9354432e-05, 4.8367345e-34, 3.337407e-07, 4.5185787e-18, 1.992394e-21, 7.845255e-12, 5.545996e-20]}}}
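As a sanity check, the `floatVal` field of the gRPC response can be decoded client-side to recover the predicted digit. A minimal sketch, using the probabilities copied from the response above (the helper is illustrative, not part of the Seldon client API):

```python
# Response dict copied from the gRPC call above.
d = {
    "data": {
        "tftensor": {
            "dtype": "DT_FLOAT",
            "tensorShape": {"dim": [{"size": "1"}, {"size": "10"}]},
            "floatVal": [1.0785601e-24, 6.920695e-34, 0.9999403, 5.9354432e-05,
                         4.8367345e-34, 3.337407e-07, 4.5185787e-18, 1.992394e-21,
                         7.845255e-12, 5.545996e-20],
        }
    }
}

# Pick the index of the highest class probability (argmax without numpy).
probs = d["data"]["tftensor"]["floatVal"]
predicted = max(range(len(probs)), key=probs.__getitem__)
print(predicted)  # → 2
```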
Finally, delete the model we deployed:
[38]:
!kubectl delete -f ./resources/mnist_rest.yaml
seldondeployment.machinelearning.seldon.io "tfserving" deleted
Serve TensorFlow Model with TensorFlow Protocol¶
The config below shows how to deploy a TensorFlow model that exposes the TensorFlow protocol.
[39]:
%%writefile ./resources/halfplustwo_rest.yaml
apiVersion: machinelearning.seldon.io/v1alpha2
kind: SeldonDeployment
metadata:
name: hpt
spec:
name: hpt
protocol: tensorflow
transport: rest
predictors:
- graph:
children: []
implementation: TENSORFLOW_SERVER
modelUri: gs://seldon-models/tfserving/half_plus_two
name: halfplustwo
parameters:
- name: model_name
type: STRING
value: halfplustwo
name: default
replicas: 1
Overwriting ./resources/halfplustwo_rest.yaml
[40]:
!kubectl apply -f ./resources/halfplustwo_rest.yaml
seldondeployment.machinelearning.seldon.io/hpt created
[41]:
!kubectl rollout status deploy/$(kubectl get deploy -l seldon-deployment-id=hpt -o jsonpath='{.items[0].metadata.name}')
Waiting for deployment "hpt-default-0-halfplustwo" rollout to finish: 0 of 1 updated replicas are available...
deployment "hpt-default-0-halfplustwo" successfully rolled out
[42]:
import json
X=!curl -s -d '{"instances": [1.0, 2.0, 5.0]}' \
-X POST http://localhost:8004/seldon/seldon/hpt/v1/models/halfplustwo/:predict \
-H "Content-Type: application/json"
d=json.loads("".join(X))
print(d)
assert(d["predictions"][0] == 2.5)
{'predictions': [2.5, 3.0, 4.5]}
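The same REST call can also be issued from Python rather than curl. A minimal sketch, assuming (as in the curl cell) that the Istio ingress is reachable on localhost:8004; the try/except is only there so the snippet degrades gracefully when no cluster is running:

```python
import requests

# Assumes the Istio ingress is port-forwarded to localhost:8004, as above.
endpoint = "http://localhost:8004/seldon/seldon/hpt/v1/models/halfplustwo/:predict"
payload = {"instances": [1.0, 2.0, 5.0]}

try:
    response = requests.post(endpoint, json=payload, timeout=5)
    print(response.json())
except requests.exceptions.RequestException:
    print("cluster not reachable; request payload:", payload)
```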
[43]:
X=!cd ../executor/proto && grpcurl \
-d '{"model_spec":{"name":"halfplustwo"},"inputs":{"x":{"dtype": 1, "tensor_shape": {"dim":[{"size": 3}]}, "floatVal" : [1.0, 2.0, 3.0]}}}' \
-rpc-header seldon:hpt -rpc-header namespace:seldon \
-plaintext -proto ./prediction_service.proto \
0.0.0.0:8004 tensorflow.serving.PredictionService/Predict
d=json.loads("".join(X))
print(d)
assert(d["outputs"]["x"]["floatVal"][0] == 2.5)
{'outputs': {'x': {'dtype': 'DT_FLOAT', 'tensorShape': {'dim': [{'size': '3'}]}, 'floatVal': [2.5, 3, 3.5]}}, 'modelSpec': {'name': 'halfplustwo', 'version': '123', 'signatureName': 'serving_default'}}
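Both responses can be sanity-checked against the model's definition, y = x/2 + 2:

```python
def half_plus_two(x):
    """Reference implementation of the half_plus_two toy model."""
    return x / 2 + 2

print([half_plus_two(x) for x in [1.0, 2.0, 5.0]])  # → [2.5, 3.0, 4.5] (REST inputs)
print([half_plus_two(x) for x in [1.0, 2.0, 3.0]])  # → [2.5, 3.0, 3.5] (gRPC inputs)
```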
[44]:
!kubectl delete -f ./resources/halfplustwo_rest.yaml
seldondeployment.machinelearning.seldon.io "hpt" deleted
Serve MLflow ElasticNet Wines Model¶
In order to deploy MLflow models, we can leverage the pre-packaged MLflow inference server. The exposed API can follow either:
The default Seldon protocol.
The V2 protocol.
Default Seldon protocol¶
We can deploy an MLflow model uploaded to an object store by using the MLflow model server implementation, as in the config below:
[45]:
%%writefile ./resources/elasticnet_wine.yaml
apiVersion: machinelearning.seldon.io/v1alpha2
kind: SeldonDeployment
metadata:
name: mlflow
spec:
name: wines
predictors:
- componentSpecs:
- spec:
# We are setting a high failureThreshold because installing conda dependencies
# can take a long time and we want to avoid Kubernetes killing the container prematurely
containers:
- name: classifier
livenessProbe:
initialDelaySeconds: 80
failureThreshold: 200
periodSeconds: 5
successThreshold: 1
httpGet:
path: /health/ping
port: http
scheme: HTTP
readinessProbe:
initialDelaySeconds: 80
failureThreshold: 200
periodSeconds: 5
successThreshold: 1
httpGet:
path: /health/ping
port: http
scheme: HTTP
graph:
children: []
implementation: MLFLOW_SERVER
modelUri: gs://seldon-models/v1.18.0/mlflow/elasticnet_wine
name: classifier
name: default
replicas: 1
Overwriting ./resources/elasticnet_wine.yaml
[46]:
!kubectl apply -f ./resources/elasticnet_wine.yaml
seldondeployment.machinelearning.seldon.io/mlflow created
[47]:
!kubectl rollout status deploy/$(kubectl get deploy -l seldon-deployment-id=mlflow -o jsonpath='{.items[0].metadata.name}')
Waiting for deployment "mlflow-default-0-classifier" rollout to finish: 0 of 1 updated replicas are available...
deployment "mlflow-default-0-classifier" successfully rolled out
REST requests¶
[48]:
X=!curl -s -d '{"data": {"ndarray":[[0.1,0.2,0.3,0.4,0.5,0.6,0.7,0.8,0.9,1.0,1.1]]}}' \
-X POST http://localhost:8004/seldon/seldon/mlflow/api/v1.0/predictions \
-H "Content-Type: application/json"
d=json.loads(X[0])
print(d)
{'data': {'names': [], 'ndarray': [5.275558760255375]}, 'meta': {'requestPath': {'classifier': 'seldonio/mlflowserver:1.18.0'}}}
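For readability, the flat ndarray payload can be built from named wine features. A hypothetical helper (the feature names and their ordering are an assumption here; they must match the columns the model was trained on):

```python
# Hypothetical feature ordering for the ElasticNet wine model.
FEATURES = ["fixed acidity", "volatile acidity", "citric acid", "residual sugar",
            "chlorides", "free sulfur dioxide", "total sulfur dioxide",
            "density", "pH", "sulphates", "alcohol"]

def to_ndarray_payload(sample: dict) -> dict:
    """Build a Seldon-protocol request body from a dict of named features."""
    row = [sample[name] for name in FEATURES]
    return {"data": {"ndarray": [row]}}

sample = dict(zip(FEATURES, [0.1, 0.2, 0.3, 0.4, 0.5, 0.6,
                             0.7, 0.8, 0.9, 1.0, 1.1]))
print(to_ndarray_payload(sample))
```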
[49]:
from seldon_core.seldon_client import SeldonClient
sc = SeldonClient(deployment_name="mlflow", namespace="seldon")
[50]:
r = sc.predict(gateway="istio", transport="rest", shape=(1, 11))
print(r)
assert r.success == True
Success:True message:
Request:
meta {
}
data {
tensor {
shape: 1
shape: 11
values: 0.5653756616147143
values: 0.8420723980045103
values: 0.5062962018951862
values: 0.5278571532571091
values: 0.8664864216170011
values: 0.45973619467487525
values: 0.9263453669947618
values: 0.6150362295808135
values: 0.5089439556920178
values: 0.9282257119213217
values: 0.7953848591020709
}
}
Response:
{'data': {'names': [], 'tensor': {'shape': [1], 'values': [5.244154959197477]}}, 'meta': {'requestPath': {'classifier': 'seldonio/mlflowserver:1.18.0'}}}
gRPC Requests¶
[51]:
X=!cd ../executor/proto && grpcurl -d '{"data":{"ndarray":[[0.1,0.2,0.3,0.4,0.5,0.6,0.7,0.8,0.9,1.0,1.1]]}}' \
-rpc-header seldon:mlflow -rpc-header namespace:seldon \
-plaintext \
-proto ./prediction.proto 0.0.0.0:8004 seldon.protos.Seldon/Predict
d=json.loads("".join(X))
print(d)
{'meta': {'requestPath': {'classifier': 'seldonio/mlflowserver:1.18.0'}}, 'data': {'ndarray': [5.275558760255375]}}
[52]:
r = sc.predict(gateway="istio", transport="grpc", shape=(1, 11))
print(r)
assert r.success == True
Success:True message:
Request:
{'meta': {}, 'data': {'tensor': {'shape': [1, 11], 'values': [0.49331471179570063, 0.6224359102414585, 0.9804140644783785, 0.3380112388366434, 0.8942524358731484, 0.2837043142995588, 0.9867737754039229, 0.47015800689221765, 0.5398691845908773, 0.6674452878134565, 0.4007242201724862]}}}
Response:
{'meta': {'requestPath': {'classifier': 'seldonio/mlflowserver:1.18.0'}}, 'data': {'tensor': {'shape': [1], 'values': [5.204512993729161]}}}
[53]:
!kubectl delete -f ./resources/elasticnet_wine.yaml
seldondeployment.machinelearning.seldon.io "mlflow" deleted
V2 protocol¶
We can deploy an MLflow model exposing an API compatible with the V2 protocol by specifying the protocol of our SeldonDeployment as v2. For example, consider the config below:
[54]:
%%writefile ./resources/elasticnet_wine_v2.yaml
apiVersion: machinelearning.seldon.io/v1alpha2
kind: SeldonDeployment
metadata:
name: mlflow
spec:
protocol: v2 # Activate v2 protocol
name: wines
predictors:
- graph:
children: []
implementation: MLFLOW_SERVER
modelUri: gs://seldon-models/v1.12.0-dev/mlflow/elasticnet_wine
name: classifier
name: default
replicas: 1
Overwriting ./resources/elasticnet_wine_v2.yaml
[55]:
!kubectl apply -f ./resources/elasticnet_wine_v2.yaml
seldondeployment.machinelearning.seldon.io/mlflow created
[56]:
!kubectl rollout status deploy/$(kubectl get deploy -l seldon-deployment-id=mlflow -o jsonpath='{.items[0].metadata.name}')
Waiting for deployment "mlflow-default-0-classifier" rollout to finish: 0 of 1 updated replicas are available...
deployment "mlflow-default-0-classifier" successfully rolled out
Once it’s deployed, we can send inference requests to our model. Note that, since it’s using the V2 protocol, these requests will be different from the ones using the default Seldon protocol.
[57]:
import json
import requests
inference_request = {
"parameters": {"content_type": "pd"},
"inputs": [
{
"name": "fixed acidity",
"shape": [1],
"datatype": "FP32",
"data": [7.4],
"parameters": {"content_type": "np"},
},
{
"name": "volatile acidity",
"shape": [1],
"datatype": "FP32",
"data": [0.7000],
"parameters": {"content_type": "np"},
},
{
"name": "citric acidity",
"shape": [1],
"datatype": "FP32",
"data": [0],
"parameters": {"content_type": "np"},
},
{
"name": "residual sugar",
"shape": [1],
"datatype": "FP32",
"data": [1.9],
"parameters": {"content_type": "np"},
},
{
"name": "chlorides",
"shape": [1],
"datatype": "FP32",
"data": [0.076],
"parameters": {"content_type": "np"},
},
{
"name": "free sulfur dioxide",
"shape": [1],
"datatype": "FP32",
"data": [11],
"parameters": {"content_type": "np"},
},
{
"name": "total sulfur dioxide",
"shape": [1],
"datatype": "FP32",
"data": [34],
"parameters": {"content_type": "np"},
},
{
"name": "density",
"shape": [1],
"datatype": "FP32",
"data": [0.9978],
"parameters": {"content_type": "np"},
},
{
"name": "pH",
"shape": [1],
"datatype": "FP32",
"data": [3.51],
"parameters": {"content_type": "np"},
},
{
"name": "sulphates",
"shape": [1],
"datatype": "FP32",
"data": [0.56],
"parameters": {"content_type": "np"},
},
{
"name": "alcohol",
"shape": [1],
"datatype": "FP32",
"data": [9.4],
"parameters": {"content_type": "np"},
},
],
}
endpoint = "http://localhost:8004/seldon/seldon/mlflow/v2/models/infer"
response = requests.post(endpoint, json=inference_request)
print(json.dumps(response.json(), indent=2))
assert response.ok
{
"model_name": "classifier",
"model_version": "v1",
"id": "732f2602-cc8a-479d-86c8-ee99bafd1a1e",
"parameters": null,
"outputs": [
{
"name": "predict",
"shape": [
1
],
"datatype": "FP64",
"parameters": null,
"data": [
6.016145744177844
]
}
]
}
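The repetitive inputs list above can also be generated programmatically from a feature dict. A minimal sketch (the helper name is hypothetical; it assumes scalar FP32 features, as in the request above):

```python
def to_v2_inputs(features: dict) -> dict:
    """Build a V2 (Open Inference Protocol) request from scalar features."""
    return {
        "parameters": {"content_type": "pd"},
        "inputs": [
            {
                "name": name,
                "shape": [1],
                "datatype": "FP32",
                "data": [value],
                "parameters": {"content_type": "np"},
            }
            for name, value in features.items()
        ],
    }

req = to_v2_inputs({"fixed acidity": 7.4, "alcohol": 9.4})
print(len(req["inputs"]))  # → 2
```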
[58]:
!kubectl delete -f ./resources/elasticnet_wine_v2.yaml
seldondeployment.machinelearning.seldon.io "mlflow" deleted