Command Line Scripts

Seldon provides several scripts to help you start Seldon, provision services and shut Seldon down.

seldon-up

Synopsis

Create Seldon on a running Kubernetes cluster.

seldon-up

Examples

To launch Seldon with all components, run

seldon-up

To start with GlusterFS, run

SELDON_WITH_GLUSTERFS=true seldon-up

To start without a Spark cluster, run

SELDON_WITH_SPARK=false seldon-up
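
The two environment variables can be combined. For example (a sketch using only the variables documented above), to start Seldon with GlusterFS but without a Spark cluster run

SELDON_WITH_GLUSTERFS=true SELDON_WITH_SPARK=false seldon-up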

seldon-down

Synopsis

Shut down Seldon running on a Kubernetes cluster.

seldon-down

seldon-cli

See the detailed seldon-cli documentation.

start-microservice

Synopsis

Start one or more microservices for a particular client. Microservices are exposed over REST or, for prediction services, optionally over gRPC. The script can start microservices of two types, recommendation or prediction:

usage: start-microservice [-h] [-i name image microservice API ratio]
                          [-p name folder microservice API ratio]
                          --client CLIENT [--replicas REPLICAS]
                          --type {recommendation,prediction}

optional arguments:
  -h, --help            show this help message and exit
  -i name image microservice API ratio
                        microservice image defn: <name> <image> <API type (rest or rpc)> <ratio>
  -p name folder microservice API ratio
                        microservice from pipeline defn: <name> <folder> <API type (rest or rpc)> <ratio>
  --client CLIENT       client name
  --replicas REPLICAS   number of replicas
  --type {recommendation,prediction}
                        microservice type

Examples

Start a recommendation microservice from a built Docker image, exposed as a REST endpoint, for the client “reuters”. See the worked Reuters content recommendation example.

start-microservice --type recommendation --client reuters -i reuters-example seldonio/reuters-example:2.0.7 rest 1.0

Start a prediction REST microservice from a pipeline previously saved to /seldon-data/seldon-models/finefoods/1 for the client “test”. See the worked sentiment analysis demo.

start-microservice --type prediction --client test -p finefoods-xgboost /seldon-data/seldon-models/finefoods/1/ rest 1.0

Start a prediction microservice from an XGBoost model exposed as a REST service and packaged in a Docker image. See the worked example in the Iris prediction demo.

start-microservice --type prediction --client test -i iris-xgboost seldonio/iris_xgboost:2.1 rest 1.0

Start an A/B test with two microservices.

start-microservice --type prediction --client test -i iris-xgboost seldonio/iris_xgboost:2.1 rest 0.5 -i iris-scikit seldonio/iris_scikit:2.1 rest 0.5
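
Each -i entry takes its own ratio as its final argument; here the 0.5/0.5 split routes roughly half of the prediction traffic to each variant.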

Start a gRPC microservice for the Iris demo.

start-microservice --type prediction --client test -i xgboostrpc seldonio/iris_xgboost_rpc:2.1 rpc 1.0
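
The --replicas flag documented above scales a microservice horizontally. As an illustrative sketch (the replica count is arbitrary, reusing the Iris image from the example above):

start-microservice --type prediction --client test --replicas 3 -i iris-xgboost seldonio/iris_xgboost:2.1 rest 1.0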

launch-locust-load-test

Synopsis

Create a Locust load test. Presently for prediction services only; handles both REST and gRPC endpoints.

usage: launch-locust-load-test [-h] --seldon-client SELDON_CLIENT
                               [--locust-slaves LOCUST_SLAVES]
                               [--locust-hatch-rate LOCUST_HATCH_RATE]
                               [--locust-clients LOCUST_CLIENTS]
                               --test-type {js-predict,grpc-predict}
                               [--seldon-grpc-endpoint SELDON_GRPC_ENDPOINT]
                               [--seldon-oauth-endpoint SELDON_OAUTH_ENDPOINT]
                               [--seldon-predict-default-data-size SELDON_PREDICT_DEFAULT_DATA_SIZE]
optional arguments:
  -h, --help            show this help message and exit
  --seldon-client SELDON_CLIENT
                        client name
  --locust-slaves LOCUST_SLAVES
                        number of slaves to run
  --locust-hatch-rate LOCUST_HATCH_RATE
                        locust hatch rate
  --locust-clients LOCUST_CLIENTS
                        number of locust clients
  --test-type {js-predict,grpc-predict}
                        type of test to run
  --seldon-grpc-endpoint SELDON_GRPC_ENDPOINT
                        seldon grpc endpoint
  --seldon-oauth-endpoint SELDON_OAUTH_ENDPOINT
                        seldon oauth endpoint
  --seldon-predict-default-data-size SELDON_PREDICT_DEFAULT_DATA_SIZE
                        the size of the default list of random floats to send
                        to predict endpoint

Examples

# launch grpc prediction load test
launch-locust-load-test --seldon-client deep_mnist_client --test-type grpc-predict
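
The other test type, js-predict, can be combined with the Locust tuning options documented above; the slave, client and hatch-rate values below are illustrative rather than defaults.

# launch REST prediction load test with explicit locust settings
launch-locust-load-test --seldon-client deep_mnist_client --test-type js-predict --locust-slaves 4 --locust-clients 50 --locust-hatch-rate 5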