This page was generated from examples/models/deep_mnist/deep_mnist.ipynb.
TensorFlow MNIST Model
- Wrap a TensorFlow MNIST Python model for use as a prediction microservice in seldon-core
- Run locally on Docker to test
- Deploy on seldon-core running on minikube
Train locally
[1]:
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)
import tensorflow as tf

if __name__ == '__main__':
    # Single-layer softmax classifier over flattened 28x28 images
    x = tf.placeholder(tf.float32, [None, 784], name="x")
    W = tf.Variable(tf.zeros([784, 10]))
    b = tf.Variable(tf.zeros([10]))
    y = tf.nn.softmax(tf.matmul(x, W) + b, name="y")

    # One-hot labels and cross-entropy loss, minimised with plain SGD
    y_ = tf.placeholder(tf.float32, [None, 10])
    cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))
    train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

    # initialize_all_variables is deprecated; tf.global_variables_initializer is the replacement
    init = tf.initialize_all_variables()
    sess = tf.Session()
    sess.run(init)

    # Train on 1000 mini-batches of 100 images
    for i in range(1000):
        batch_xs, batch_ys = mnist.train.next_batch(100)
        sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})

    # Report test-set accuracy and save a checkpoint for serving
    correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
    print(sess.run(accuracy, feed_dict={x: mnist.test.images, y_: mnist.test.labels}))

    saver = tf.train.Saver()
    saver.save(sess, "model/deep_mnist_model")
WARNING:tensorflow:From <ipython-input-1-b7995d30f035>:2: read_data_sets (from tensorflow.contrib.learn.python.learn.datasets.mnist) is deprecated and will be removed in a future version.
Instructions for updating:
Please use alternatives such as official/mnist/dataset.py from tensorflow/models.
WARNING:tensorflow:From /home/clive/anaconda3/lib/python3.6/site-packages/tensorflow/contrib/learn/python/learn/datasets/mnist.py:260: maybe_download (from tensorflow.contrib.learn.python.learn.datasets.base) is deprecated and will be removed in a future version.
Instructions for updating:
Please write your own downloading logic.
WARNING:tensorflow:From /home/clive/anaconda3/lib/python3.6/site-packages/tensorflow/contrib/learn/python/learn/datasets/mnist.py:262: extract_images (from tensorflow.contrib.learn.python.learn.datasets.mnist) is deprecated and will be removed in a future version.
Instructions for updating:
Please use tf.data to implement this functionality.
Extracting MNIST_data/train-images-idx3-ubyte.gz
WARNING:tensorflow:From /home/clive/anaconda3/lib/python3.6/site-packages/tensorflow/contrib/learn/python/learn/datasets/mnist.py:267: extract_labels (from tensorflow.contrib.learn.python.learn.datasets.mnist) is deprecated and will be removed in a future version.
Instructions for updating:
Please use tf.data to implement this functionality.
Extracting MNIST_data/train-labels-idx1-ubyte.gz
WARNING:tensorflow:From /home/clive/anaconda3/lib/python3.6/site-packages/tensorflow/contrib/learn/python/learn/datasets/mnist.py:110: dense_to_one_hot (from tensorflow.contrib.learn.python.learn.datasets.mnist) is deprecated and will be removed in a future version.
Instructions for updating:
Please use tf.one_hot on tensors.
Extracting MNIST_data/t10k-images-idx3-ubyte.gz
Extracting MNIST_data/t10k-labels-idx1-ubyte.gz
WARNING:tensorflow:From /home/clive/anaconda3/lib/python3.6/site-packages/tensorflow/contrib/learn/python/learn/datasets/mnist.py:290: DataSet.__init__ (from tensorflow.contrib.learn.python.learn.datasets.mnist) is deprecated and will be removed in a future version.
Instructions for updating:
Please use alternatives such as official/mnist/dataset.py from tensorflow/models.
WARNING:tensorflow:From /home/clive/anaconda3/lib/python3.6/site-packages/tensorflow/python/util/tf_should_use.py:118: initialize_all_variables (from tensorflow.python.ops.variables) is deprecated and will be removed after 2017-03-02.
Instructions for updating:
Use `tf.global_variables_initializer` instead.
0.9159
Wrap model using s2i
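s2i wraps a small Python class that loads the saved checkpoint and exposes a predict method; seldon-core then serves that class as a REST/gRPC microservice. The sketch below shows roughly what such a wrapper looks like — the file name DeepMnist.py, the tensor lookups and the class_names attribute are illustrative assumptions and may differ from the wrapper shipped with this example.

# DeepMnist.py -- hypothetical sketch of the seldon-core Python wrapper,
# assuming the checkpoint written above to model/deep_mnist_model.
import tensorflow as tf

class DeepMnist(object):
    def __init__(self):
        self.class_names = ["class:{}".format(i) for i in range(10)]
        self.sess = tf.Session()
        # Restore the graph and weights saved by the training cell
        saver = tf.train.import_meta_graph("model/deep_mnist_model.meta")
        saver.restore(self.sess, "model/deep_mnist_model")
        graph = tf.get_default_graph()
        self.x = graph.get_tensor_by_name("x:0")
        self.y = graph.get_tensor_by_name("y:0")

    def predict(self, X, features_names=None):
        # X arrives as a (batch, 784) array of pixel intensities in [0, 1]
        return self.sess.run(self.y, feed_dict={self.x: X})

The s2i builder picks this class up via the .s2i/environment settings (MODEL_NAME, API_TYPE, SERVICE_TYPE, PERSISTENCE) and the requirements.txt seen in the build output below.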
[2]:
!s2i build . seldonio/seldon-core-s2i-python36:0.5.1 deep-mnist:0.1
---> Installing application source...
---> Installing dependencies ...
Looking in links: /whl
Requirement already satisfied: tensorflow>=1.12.0 in /usr/local/lib/python3.6/site-packages (from -r requirements.txt (line 1)) (1.12.0)
Requirement already satisfied: keras-applications>=1.0.6 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.0.7)
Requirement already satisfied: tensorboard<1.13.0,>=1.12.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.12.2)
Requirement already satisfied: gast>=0.2.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.2.2)
Requirement already satisfied: protobuf>=3.6.1 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (3.6.1)
Requirement already satisfied: grpcio>=1.8.6 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.18.0)
Requirement already satisfied: six>=1.10.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.12.0)
Requirement already satisfied: wheel>=0.26 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.33.1)
Requirement already satisfied: numpy>=1.13.3 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.16.1)
Requirement already satisfied: keras-preprocessing>=1.0.5 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.0.9)
Requirement already satisfied: astor>=0.6.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.7.1)
Requirement already satisfied: absl-py>=0.1.6 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.7.0)
Requirement already satisfied: termcolor>=1.1.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.1.0)
Requirement already satisfied: h5py in /usr/local/lib/python3.6/site-packages (from keras-applications>=1.0.6->tensorflow>=1.12.0->-r requirements.txt (line 1)) (2.9.0)
Requirement already satisfied: werkzeug>=0.11.10 in /usr/local/lib/python3.6/site-packages (from tensorboard<1.13.0,>=1.12.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.14.1)
Requirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.6/site-packages (from tensorboard<1.13.0,>=1.12.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (3.0.1)
Requirement already satisfied: setuptools in /usr/local/lib/python3.6/site-packages (from protobuf>=3.6.1->tensorflow>=1.12.0->-r requirements.txt (line 1)) (40.8.0)
Url '/whl' is ignored. It is either a non-existing path or lacks a specific scheme.
Build completed successfully
[3]:
!docker run --name "mnist_predictor" -d --rm -p 5000:5000 deep-mnist:0.1
13b5515bb7074825056c4795cacd4348b6d28272def928380116a1fe9ea30659
Send some random features that conform to the contract
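The tester generates a random payload from contract.json, which describes the model's input (and expected output) schema. For a 784-pixel MNIST input the contract plausibly looks like the sketch below; the exact names and ranges in the repository's file may differ.

{
    "features": [
        {"name": "x", "dtype": "FLOAT", "ftype": "continuous", "range": [0, 1], "repeat": 784}
    ],
    "targets": [
        {"name": "class", "dtype": "FLOAT", "ftype": "continuous", "range": [0, 1], "repeat": 10}
    ]
}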
[4]:
!seldon-core-tester contract.json 0.0.0.0 5000 -p
----------------------------------------
SENDING NEW REQUEST:
RECEIVED RESPONSE:
Success:True message:
Request:
data {
ndarray {
values {
list_value {
values {
number_value: 0.948
}
values {
number_value: 0.333
}
values {
number_value: 0.318
}
values {
number_value: 0.966
}
values {
number_value: 0.711
}
values {
number_value: 0.982
}
values {
number_value: 0.776
}
values {
number_value: 0.638
}
values {
number_value: 0.697
}
values {
number_value: 0.714
}
values {
number_value: 0.054
}
values {
number_value: 0.689
}
values {
number_value: 0.264
}
values {
number_value: 0.7
}
values {
number_value: 0.669
}
values {
number_value: 0.338
}
values {
number_value: 0.416
}
values {
number_value: 0.262
}
values {
number_value: 0.432
}
values {
number_value: 0.328
}
values {
number_value: 0.78
}
values {
number_value: 0.714
}
values {
number_value: 0.135
}
values {
number_value: 0.658
}
values {
number_value: 0.965
}
values {
number_value: 0.435
}
values {
number_value: 0.307
}
values {
number_value: 0.423
}
values {
number_value: 0.135
}
values {
number_value: 0.771
}
values {
number_value: 0.137
}
values {
number_value: 0.257
}
values {
number_value: 0.899
}
values {
number_value: 0.903
}
values {
number_value: 0.516
}
values {
number_value: 0.977
}
values {
number_value: 0.813
}
values {
number_value: 0.8
}
values {
number_value: 0.729
}
values {
number_value: 0.64
}
values {
number_value: 0.09
}
values {
number_value: 0.961
}
values {
number_value: 0.419
}
values {
number_value: 0.197
}
values {
number_value: 0.935
}
values {
number_value: 0.964
}
values {
number_value: 0.011
}
values {
number_value: 0.082
}
values {
number_value: 0.699
}
values {
number_value: 0.149
}
values {
number_value: 0.286
}
values {
number_value: 0.087
}
values {
number_value: 0.622
}
values {
number_value: 0.878
}
values {
number_value: 0.231
}
values {
number_value: 0.41
}
values {
number_value: 0.077
}
values {
number_value: 0.599
}
values {
number_value: 0.381
}
values {
number_value: 0.641
}
values {
number_value: 0.701
}
values {
number_value: 0.675
}
values {
number_value: 0.951
}
values {
number_value: 0.535
}
values {
number_value: 0.578
}
values {
number_value: 0.421
}
values {
number_value: 0.976
}
values {
number_value: 0.055
}
values {
number_value: 0.03
}
values {
number_value: 0.779
}
values {
number_value: 0.575
}
values {
number_value: 0.897
}
values {
number_value: 0.166
}
values {
number_value: 0.579
}
values {
number_value: 0.705
}
values {
number_value: 0.843
}
values {
number_value: 0.115
}
values {
number_value: 0.49
}
values {
number_value: 0.51
}
values {
number_value: 0.187
}
values {
number_value: 0.934
}
values {
number_value: 0.149
}
values {
number_value: 0.421
}
values {
number_value: 0.787
}
values {
number_value: 0.222
}
values {
number_value: 0.293
}
values {
number_value: 0.524
}
values {
number_value: 0.957
}
values {
number_value: 0.352
}
values {
number_value: 0.023
}
values {
number_value: 0.547
}
values {
number_value: 0.754
}
values {
number_value: 0.203
}
values {
number_value: 0.209
}
values {
number_value: 0.818
}
values {
number_value: 0.289
}
values {
number_value: 0.212
}
values {
number_value: 0.27
}
values {
number_value: 0.576
}
values {
number_value: 0.627
}
values {
number_value: 0.228
}
values {
number_value: 0.673
}
values {
number_value: 0.966
}
values {
number_value: 0.973
}
values {
number_value: 0.1
}
values {
number_value: 0.48
}
values {
number_value: 0.605
}
values {
number_value: 0.747
}
values {
number_value: 0.903
}
values {
number_value: 0.912
}
values {
number_value: 0.163
}
values {
number_value: 0.031
}
values {
number_value: 0.92
}
values {
number_value: 0.867
}
values {
number_value: 0.695
}
values {
number_value: 0.394
}
values {
number_value: 0.307
}
values {
number_value: 0.024
}
values {
number_value: 0.324
}
values {
number_value: 0.9
}
values {
number_value: 0.975
}
values {
number_value: 0.683
}
values {
number_value: 0.7
}
values {
number_value: 0.028
}
values {
number_value: 0.874
}
values {
number_value: 0.739
}
values {
number_value: 0.57
}
values {
number_value: 0.087
}
values {
number_value: 0.442
}
values {
number_value: 0.463
}
values {
number_value: 0.805
}
values {
number_value: 0.47
}
values {
number_value: 0.754
}
values {
number_value: 0.829
}
values {
number_value: 0.033
}
values {
number_value: 0.266
}
values {
number_value: 0.457
}
values {
number_value: 0.166
}
values {
number_value: 0.686
}
values {
number_value: 0.594
}
values {
number_value: 0.513
}
values {
number_value: 0.476
}
values {
number_value: 0.581
}
values {
number_value: 0.312
}
values {
number_value: 0.248
}
values {
number_value: 0.389
}
values {
number_value: 0.739
}
values {
number_value: 0.87
}
values {
number_value: 0.785
}
values {
number_value: 0.962
}
values {
number_value: 0.173
}
values {
number_value: 0.397
}
values {
number_value: 0.429
}
values {
number_value: 0.575
}
values {
number_value: 0.873
}
values {
number_value: 0.001
}
values {
number_value: 0.247
}
values {
number_value: 0.46
}
values {
number_value: 0.638
}
values {
number_value: 0.537
}
values {
number_value: 0.777
}
values {
number_value: 0.803
}
values {
number_value: 0.786
}
values {
number_value: 0.704
}
values {
number_value: 0.972
}
values {
number_value: 0.175
}
values {
number_value: 0.125
}
values {
number_value: 0.855
}
values {
number_value: 0.476
}
values {
number_value: 0.974
}
values {
number_value: 0.16
}
values {
number_value: 0.054
}
values {
number_value: 0.468
}
values {
number_value: 0.661
}
values {
number_value: 0.212
}
values {
number_value: 0.951
}
values {
number_value: 0.711
}
values {
number_value: 0.364
}
values {
number_value: 0.27
}
values {
number_value: 0.517
}
values {
number_value: 0.325
}
values {
number_value: 0.714
}
values {
number_value: 0.87
}
values {
number_value: 0.017
}
values {
number_value: 0.195
}
values {
number_value: 0.99
}
values {
number_value: 0.505
}
values {
number_value: 0.607
}
values {
number_value: 0.367
}
values {
number_value: 0.928
}
values {
number_value: 0.152
}
values {
number_value: 0.44
}
values {
number_value: 0.654
}
values {
number_value: 0.142
}
values {
number_value: 0.056
}
values {
number_value: 0.348
}
values {
number_value: 0.084
}
values {
number_value: 0.993
}
values {
number_value: 0.008
}
values {
number_value: 0.456
}
values {
number_value: 0.737
}
values {
number_value: 0.843
}
values {
number_value: 0.368
}
values {
number_value: 0.767
}
values {
number_value: 0.572
}
values {
number_value: 0.193
}
values {
number_value: 0.296
}
values {
number_value: 0.634
}
values {
number_value: 0.639
}
values {
number_value: 0.552
}
values {
number_value: 0.457
}
values {
number_value: 0.742
}
values {
number_value: 0.889
}
values {
number_value: 0.133
}
values {
number_value: 0.17
}
values {
number_value: 0.919
}
values {
number_value: 0.996
}
values {
number_value: 0.268
}
values {
number_value: 0.197
}
values {
number_value: 0.207
}
values {
number_value: 0.266
}
values {
number_value: 0.314
}
values {
number_value: 0.395
}
values {
number_value: 0.94
}
values {
number_value: 0.04
}
values {
number_value: 0.146
}
values {
number_value: 0.176
}
values {
number_value: 0.964
}
values {
number_value: 0.265
}
values {
number_value: 0.529
}
values {
number_value: 0.714
}
values {
number_value: 0.524
}
values {
number_value: 0.252
}
values {
number_value: 0.711
}
values {
number_value: 0.537
}
values {
number_value: 0.205
}
values {
number_value: 0.156
}
values {
number_value: 0.172
}
values {
number_value: 0.917
}
values {
number_value: 0.834
}
values {
number_value: 0.634
}
values {
number_value: 0.605
}
values {
number_value: 0.692
}
values {
number_value: 0.689
}
values {
number_value: 0.729
}
values {
number_value: 0.314
}
values {
number_value: 0.873
}
values {
number_value: 0.032
}
values {
number_value: 0.932
}
values {
number_value: 0.353
}
values {
number_value: 0.852
}
values {
number_value: 0.748
}
values {
number_value: 0.316
}
values {
number_value: 0.827
}
values {
number_value: 0.51
}
values {
number_value: 0.444
}
values {
number_value: 0.211
}
values {
number_value: 0.131
}
values {
number_value: 0.129
}
values {
number_value: 0.843
}
values {
number_value: 0.875
}
values {
number_value: 0.378
}
values {
number_value: 0.48
}
values {
number_value: 0.157
}
values {
number_value: 0.665
}
values {
number_value: 0.104
}
values {
number_value: 0.988
}
values {
number_value: 0.705
}
values {
number_value: 0.526
}
values {
number_value: 0.39
}
values {
number_value: 0.955
}
values {
number_value: 0.217
}
values {
number_value: 0.555
}
values {
number_value: 0.274
}
values {
number_value: 0.524
}
values {
number_value: 0.496
}
values {
number_value: 0.708
}
values {
number_value: 0.65
}
values {
number_value: 0.15
}
values {
number_value: 0.925
}
values {
number_value: 0.219
}
values {
number_value: 0.671
}
values {
number_value: 0.287
}
values {
number_value: 0.434
}
values {
number_value: 0.003
}
values {
number_value: 0.239
}
values {
number_value: 0.614
}
values {
number_value: 0.143
}
values {
number_value: 0.43
}
values {
number_value: 0.198
}
values {
number_value: 0.633
}
values {
number_value: 0.017
}
values {
number_value: 0.576
}
values {
number_value: 0.449
}
values {
number_value: 0.375
}
values {
number_value: 0.814
}
values {
number_value: 0.74
}
values {
number_value: 0.879
}
values {
number_value: 0.439
}
values {
number_value: 0.114
}
values {
number_value: 0.132
}
values {
number_value: 0.19
}
values {
number_value: 0.405
}
values {
number_value: 0.089
}
values {
number_value: 0.345
}
values {
number_value: 0.375
}
values {
number_value: 0.152
}
values {
number_value: 0.541
}
values {
number_value: 0.891
}
values {
number_value: 0.546
}
values {
number_value: 0.322
}
values {
number_value: 0.356
}
values {
number_value: 0.507
}
values {
number_value: 0.512
}
values {
number_value: 0.023
}
values {
number_value: 0.047
}
values {
number_value: 0.923
}
values {
number_value: 0.676
}
values {
number_value: 0.933
}
values {
number_value: 0.557
}
values {
number_value: 0.542
}
values {
number_value: 0.939
}
values {
number_value: 0.679
}
values {
number_value: 0.107
}
values {
number_value: 0.747
}
values {
number_value: 0.97
}
values {
number_value: 0.397
}
values {
number_value: 0.547
}
values {
number_value: 0.043
}
values {
number_value: 0.332
}
values {
number_value: 0.194
}
values {
number_value: 0.982
}
values {
number_value: 0.902
}
values {
number_value: 0.966
}
values {
number_value: 0.715
}
values {
number_value: 0.569
}
values {
number_value: 0.108
}
values {
number_value: 0.712
}
values {
number_value: 0.798
}
values {
number_value: 0.483
}
values {
number_value: 0.635
}
values {
number_value: 0.882
}
values {
number_value: 0.873
}
values {
number_value: 0.645
}
values {
number_value: 0.806
}
values {
number_value: 0.498
}
values {
number_value: 0.879
}
values {
number_value: 0.082
}
values {
number_value: 0.264
}
values {
number_value: 0.4
}
values {
number_value: 0.953
}
values {
number_value: 0.292
}
values {
number_value: 0.62
}
values {
number_value: 0.548
}
values {
number_value: 0.206
}
values {
number_value: 0.74
}
values {
number_value: 0.028
}
values {
number_value: 0.005
}
values {
number_value: 0.569
}
values {
number_value: 0.743
}
values {
number_value: 0.987
}
values {
number_value: 0.663
}
values {
number_value: 0.26
}
values {
number_value: 0.164
}
values {
number_value: 0.556
}
values {
number_value: 0.85
}
values {
number_value: 0.188
}
values {
number_value: 0.438
}
values {
number_value: 0.434
}
values {
number_value: 0.618
}
values {
number_value: 0.655
}
values {
number_value: 0.349
}
values {
number_value: 0.422
}
values {
number_value: 0.429
}
values {
number_value: 0.306
}
values {
number_value: 0.898
}
values {
number_value: 0.026
}
values {
number_value: 0.968
}
values {
number_value: 0.009
}
values {
number_value: 0.364
}
values {
number_value: 0.742
}
values {
number_value: 0.224
}
values {
number_value: 0.431
}
values {
number_value: 0.642
}
values {
number_value: 0.607
}
values {
number_value: 0.9
}
values {
number_value: 0.598
}
values {
number_value: 0.68
}
values {
number_value: 0.304
}
values {
number_value: 0.654
}
values {
number_value: 0.448
}
values {
number_value: 0.433
}
values {
number_value: 0.259
}
values {
number_value: 0.174
}
values {
number_value: 0.461
}
values {
number_value: 0.78
}
values {
number_value: 0.977
}
values {
number_value: 0.162
}
values {
number_value: 0.601
}
values {
number_value: 0.48
}
values {
number_value: 0.683
}
values {
number_value: 0.399
}
values {
number_value: 0.203
}
values {
number_value: 0.258
}
values {
number_value: 0.429
}
values {
number_value: 0.842
}
values {
number_value: 0.65
}
values {
number_value: 0.092
}
values {
number_value: 0.849
}
values {
number_value: 0.87
}
values {
number_value: 0.256
}
values {
number_value: 0.871
}
values {
number_value: 0.543
}
values {
number_value: 0.256
}
values {
number_value: 0.494
}
values {
number_value: 0.791
}
values {
number_value: 0.292
}
values {
number_value: 0.562
}
values {
number_value: 0.699
}
values {
number_value: 0.624
}
values {
number_value: 0.242
}
values {
number_value: 0.478
}
values {
number_value: 0.002
}
values {
number_value: 0.969
}
values {
number_value: 0.219
}
values {
number_value: 0.102
}
values {
number_value: 0.354
}
values {
number_value: 0.953
}
values {
number_value: 0.932
}
values {
number_value: 0.635
}
values {
number_value: 0.365
}
values {
number_value: 0.822
}
values {
number_value: 0.485
}
values {
number_value: 0.379
}
values {
number_value: 0.477
}
values {
number_value: 0.961
}
values {
number_value: 0.063
}
values {
number_value: 0.06
}
values {
number_value: 0.107
}
values {
number_value: 0.814
}
values {
number_value: 0.234
}
values {
number_value: 0.076
}
values {
number_value: 0.289
}
values {
number_value: 0.92
}
values {
number_value: 0.738
}
values {
number_value: 0.371
}
values {
number_value: 0.684
}
values {
number_value: 0.31
}
values {
number_value: 0.802
}
values {
number_value: 0.351
}
values {
number_value: 0.536
}
values {
number_value: 0.943
}
values {
number_value: 0.569
}
values {
number_value: 0.072
}
values {
number_value: 0.954
}
values {
number_value: 0.844
}
values {
number_value: 0.542
}
values {
number_value: 0.211
}
values {
number_value: 0.808
}
values {
number_value: 0.538
}
values {
number_value: 0.337
}
values {
number_value: 0.936
}
values {
number_value: 0.079
}
values {
number_value: 0.391
}
values {
number_value: 0.211
}
values {
number_value: 0.046
}
values {
number_value: 0.269
}
values {
number_value: 0.201
}
values {
number_value: 0.457
}
values {
number_value: 0.382
}
values {
number_value: 0.416
}
values {
number_value: 0.124
}
values {
number_value: 0.023
}
values {
number_value: 0.046
}
values {
number_value: 0.775
}
values {
number_value: 0.915
}
values {
number_value: 0.795
}
values {
number_value: 0.343
}
values {
number_value: 0.342
}
values {
number_value: 0.471
}
values {
number_value: 0.214
}
values {
number_value: 0.418
}
values {
number_value: 0.166
}
values {
number_value: 0.201
}
values {
number_value: 0.859
}
values {
number_value: 0.028
}
values {
number_value: 0.618
}
values {
number_value: 0.04
}
values {
number_value: 0.862
}
values {
number_value: 0.744
}
values {
number_value: 0.629
}
values {
number_value: 0.385
}
values {
number_value: 0.83
}
values {
number_value: 0.293
}
values {
number_value: 0.006
}
values {
number_value: 0.378
}
values {
number_value: 0.704
}
values {
number_value: 0.311
}
values {
number_value: 0.693
}
values {
number_value: 0.667
}
values {
number_value: 0.446
}
values {
number_value: 0.484
}
values {
number_value: 0.983
}
values {
number_value: 0.278
}
values {
number_value: 0.487
}
values {
number_value: 0.419
}
values {
number_value: 0.961
}
values {
number_value: 0.947
}
values {
number_value: 0.371
}
values {
number_value: 0.342
}
values {
number_value: 0.449
}
values {
number_value: 0.517
}
values {
number_value: 0.494
}
values {
number_value: 0.997
}
values {
number_value: 0.874
}
values {
number_value: 0.961
}
values {
number_value: 0.014
}
values {
number_value: 0.633
}
values {
number_value: 0.229
}
values {
number_value: 0.715
}
values {
number_value: 0.712
}
values {
number_value: 0.292
}
values {
number_value: 0.532
}
values {
number_value: 0.435
}
values {
number_value: 0.71
}
values {
number_value: 0.352
}
values {
number_value: 0.265
}
values {
number_value: 0.329
}
values {
number_value: 0.907
}
values {
number_value: 0.752
}
values {
number_value: 0.485
}
values {
number_value: 0.111
}
values {
number_value: 0.163
}
values {
number_value: 0.794
}
values {
number_value: 0.65
}
values {
number_value: 0.598
}
values {
number_value: 0.364
}
values {
number_value: 0.34
}
values {
number_value: 0.052
}
values {
number_value: 0.66
}
values {
number_value: 0.812
}
values {
number_value: 0.422
}
values {
number_value: 0.412
}
values {
number_value: 0.292
}
values {
number_value: 0.171
}
values {
number_value: 0.058
}
values {
number_value: 0.041
}
values {
number_value: 0.785
}
values {
number_value: 0.065
}
values {
number_value: 0.105
}
values {
number_value: 0.643
}
values {
number_value: 0.905
}
values {
number_value: 0.409
}
values {
number_value: 0.21
}
values {
number_value: 0.25
}
values {
number_value: 0.065
}
values {
number_value: 0.763
}
values {
number_value: 0.489
}
values {
number_value: 0.143
}
values {
number_value: 0.104
}
values {
number_value: 0.361
}
values {
number_value: 0.53
}
values {
number_value: 0.348
}
values {
number_value: 0.647
}
values {
number_value: 0.456
}
values {
number_value: 0.564
}
values {
number_value: 0.48
}
values {
number_value: 0.064
}
values {
number_value: 0.927
}
values {
number_value: 0.217
}
values {
number_value: 0.088
}
values {
number_value: 0.962
}
values {
number_value: 0.943
}
values {
number_value: 0.659
}
values {
number_value: 0.371
}
values {
number_value: 0.074
}
values {
number_value: 0.017
}
values {
number_value: 0.748
}
values {
number_value: 0.23
}
values {
number_value: 0.137
}
values {
number_value: 0.725
}
values {
number_value: 0.098
}
values {
number_value: 0.841
}
values {
number_value: 0.13
}
values {
number_value: 0.149
}
values {
number_value: 0.586
}
values {
number_value: 0.858
}
values {
number_value: 0.688
}
values {
number_value: 0.43
}
values {
number_value: 0.094
}
values {
number_value: 0.526
}
values {
number_value: 0.631
}
values {
number_value: 0.346
}
values {
number_value: 0.729
}
values {
number_value: 0.274
}
values {
number_value: 0.335
}
values {
number_value: 0.194
}
values {
number_value: 0.462
}
values {
number_value: 0.088
}
values {
number_value: 0.239
}
values {
number_value: 0.71
}
values {
number_value: 0.641
}
values {
number_value: 0.371
}
values {
number_value: 0.103
}
values {
number_value: 0.409
}
values {
number_value: 0.79
}
values {
number_value: 0.34
}
values {
number_value: 0.294
}
values {
number_value: 0.625
}
values {
number_value: 0.253
}
values {
number_value: 0.024
}
values {
number_value: 0.422
}
values {
number_value: 0.415
}
values {
number_value: 0.948
}
values {
number_value: 0.107
}
values {
number_value: 0.954
}
values {
number_value: 0.738
}
values {
number_value: 0.204
}
values {
number_value: 0.896
}
values {
number_value: 0.758
}
values {
number_value: 0.685
}
values {
number_value: 0.801
}
values {
number_value: 0.037
}
values {
number_value: 0.63
}
values {
number_value: 0.58
}
values {
number_value: 0.639
}
values {
number_value: 0.982
}
values {
number_value: 0.475
}
values {
number_value: 0.986
}
values {
number_value: 0.06
}
values {
number_value: 0.797
}
values {
number_value: 0.334
}
values {
number_value: 0.829
}
values {
number_value: 0.384
}
values {
number_value: 0.634
}
values {
number_value: 0.728
}
values {
number_value: 0.489
}
values {
number_value: 0.498
}
values {
number_value: 0.805
}
values {
number_value: 0.024
}
values {
number_value: 0.796
}
values {
number_value: 0.78
}
values {
number_value: 0.852
}
values {
number_value: 0.516
}
values {
number_value: 0.905
}
values {
number_value: 0.047
}
values {
number_value: 0.964
}
values {
number_value: 0.248
}
values {
number_value: 0.459
}
values {
number_value: 0.819
}
values {
number_value: 0.52
}
values {
number_value: 0.623
}
values {
number_value: 0.482
}
values {
number_value: 0.056
}
values {
number_value: 0.259
}
values {
number_value: 0.141
}
values {
number_value: 0.943
}
values {
number_value: 0.909
}
values {
number_value: 0.759
}
values {
number_value: 0.399
}
values {
number_value: 0.609
}
values {
number_value: 0.412
}
values {
number_value: 0.963
}
values {
number_value: 0.977
}
values {
number_value: 0.148
}
values {
number_value: 0.048
}
values {
number_value: 0.914
}
values {
number_value: 0.942
}
values {
number_value: 0.462
}
values {
number_value: 0.053
}
values {
number_value: 0.741
}
values {
number_value: 0.55
}
values {
number_value: 0.188
}
values {
number_value: 0.897
}
values {
number_value: 0.685
}
values {
number_value: 0.826
}
values {
number_value: 0.875
}
values {
number_value: 0.126
}
values {
number_value: 0.194
}
values {
number_value: 0.743
}
values {
number_value: 0.786
}
values {
number_value: 0.265
}
values {
number_value: 0.364
}
values {
number_value: 0.003
}
values {
number_value: 0.442
}
values {
number_value: 0.589
}
values {
number_value: 0.145
}
values {
number_value: 0.179
}
values {
number_value: 0.255
}
values {
number_value: 0.842
}
values {
number_value: 0.338
}
values {
number_value: 0.223
}
values {
number_value: 0.311
}
values {
number_value: 0.079
}
values {
number_value: 0.362
}
values {
number_value: 0.042
}
values {
number_value: 0.324
}
values {
number_value: 0.459
}
values {
number_value: 0.092
}
values {
number_value: 0.203
}
values {
number_value: 0.908
}
values {
number_value: 0.563
}
values {
number_value: 0.112
}
values {
number_value: 0.715
}
values {
number_value: 0.214
}
values {
number_value: 0.099
}
values {
number_value: 0.67
}
values {
number_value: 0.156
}
values {
number_value: 0.84
}
values {
number_value: 0.475
}
values {
number_value: 0.88
}
values {
number_value: 0.555
}
values {
number_value: 0.165
}
values {
number_value: 0.907
}
values {
number_value: 0.473
}
values {
number_value: 0.014
}
values {
number_value: 0.236
}
values {
number_value: 0.454
}
values {
number_value: 0.061
}
values {
number_value: 0.466
}
values {
number_value: 0.059
}
values {
number_value: 0.696
}
values {
number_value: 0.059
}
values {
number_value: 0.518
}
values {
number_value: 0.142
}
values {
number_value: 0.852
}
values {
number_value: 0.781
}
values {
number_value: 0.736
}
values {
number_value: 0.4
}
values {
number_value: 0.271
}
values {
number_value: 0.222
}
values {
number_value: 0.296
}
values {
number_value: 0.031
}
values {
number_value: 0.944
}
values {
number_value: 0.874
}
values {
number_value: 0.034
}
values {
number_value: 0.029
}
values {
number_value: 0.771
}
values {
number_value: 0.906
}
values {
number_value: 0.326
}
values {
number_value: 0.662
}
values {
number_value: 0.668
}
values {
number_value: 0.31
}
values {
number_value: 0.844
}
values {
number_value: 0.559
}
values {
number_value: 0.592
}
values {
number_value: 0.647
}
values {
number_value: 0.471
}
values {
number_value: 0.089
}
values {
number_value: 0.267
}
values {
number_value: 0.759
}
values {
number_value: 0.512
}
values {
number_value: 0.693
}
values {
number_value: 0.24
}
values {
number_value: 0.505
}
values {
number_value: 0.646
}
values {
number_value: 0.751
}
values {
number_value: 0.552
}
values {
number_value: 0.746
}
values {
number_value: 0.251
}
values {
number_value: 0.621
}
values {
number_value: 0.495
}
values {
number_value: 0.571
}
values {
number_value: 0.497
}
values {
number_value: 0.843
}
values {
number_value: 0.155
}
values {
number_value: 0.038
}
values {
number_value: 0.4
}
values {
number_value: 0.749
}
values {
number_value: 0.217
}
values {
number_value: 0.907
}
values {
number_value: 0.605
}
values {
number_value: 0.965
}
values {
number_value: 0.781
}
values {
number_value: 0.548
}
values {
number_value: 0.346
}
values {
number_value: 0.875
}
values {
number_value: 0.2
}
values {
number_value: 0.529
}
values {
number_value: 0.499
}
values {
number_value: 0.858
}
values {
number_value: 0.569
}
values {
number_value: 0.597
}
values {
number_value: 0.241
}
}
}
}
}
Response:
meta {
}
data {
names: "class:0"
names: "class:1"
names: "class:2"
names: "class:3"
names: "class:4"
names: "class:5"
names: "class:6"
names: "class:7"
names: "class:8"
names: "class:9"
ndarray {
values {
list_value {
values {
number_value: 0.001586819882504642
}
values {
number_value: 5.388684485296835e-07
}
values {
number_value: 0.0681319385766983
}
values {
number_value: 0.39322274923324585
}
values {
number_value: 2.890761606977321e-06
}
values {
number_value: 0.4975513219833374
}
values {
number_value: 0.0009588279062882066
}
values {
number_value: 0.0003235357580706477
}
values {
number_value: 0.037815213203430176
}
values {
number_value: 0.00040610713767819107
}
}
}
}
}
[5]:
!docker rm mnist_predictor --force
mnist_predictor
Test using Minikube
Due to a minikube/s2i issue (https://github.com/SeldonIO/seldon-core/issues/253) you will need s2i >= 1.1.13 (https://github.com/openshift/source-to-image/releases/tag/v1.1.13).
[6]:
!minikube start --memory 4096
minikube v0.34.1 on linux (amd64)
Creating virtualbox VM (CPUs=2, Memory=4096MB, Disk=20000MB) ...
"minikube" IP address is 192.168.99.100
Configuring Docker as the container runtime ...
Preparing Kubernetes environment ...
Pulling images required by Kubernetes v1.13.3 ...
Launching Kubernetes v1.13.3 using kubeadm ...
Configuring cluster permissions ...
Verifying component health .....
kubectl is now configured to use "minikube"
Done! Thank you for using minikube!
[7]:
!kubectl create clusterrolebinding kube-system-cluster-admin --clusterrole=cluster-admin --serviceaccount=kube-system:default
clusterrolebinding.rbac.authorization.k8s.io/kube-system-cluster-admin created
[8]:
!helm init
$HELM_HOME has been configured at /home/clive/.helm.
Tiller (the Helm server-side component) has been installed into your Kubernetes Cluster.
Please note: by default, Tiller is deployed with an insecure 'allow unauthenticated users' policy.
To prevent this, run `helm init` with the --tiller-tls-verify flag.
For more information on securing your installation see: https://docs.helm.sh/using_helm/#securing-your-helm-installation
Happy Helming!
[9]:
!kubectl rollout status deploy/tiller-deploy -n kube-system
Waiting for deployment "tiller-deploy" rollout to finish: 0 of 1 updated replicas are available...
deployment "tiller-deploy" successfully rolled out
[10]:
!helm install ../../../helm-charts/seldon-core-crd --name seldon-core-crd --set usage_metrics.enabled=true
!helm install ../../../helm-charts/seldon-core --name seldon-core
NAME: seldon-core-crd
LAST DEPLOYED: Wed Mar 13 09:35:37 2019
NAMESPACE: default
STATUS: DEPLOYED
RESOURCES:
==> v1/ServiceAccount
NAME SECRETS AGE
seldon-spartakus-volunteer 1 0s
==> v1beta1/ClusterRole
NAME AGE
seldon-spartakus-volunteer 0s
==> v1beta1/ClusterRoleBinding
NAME AGE
seldon-spartakus-volunteer 0s
==> v1/ConfigMap
NAME DATA AGE
seldon-spartakus-config 3 5s
==> v1beta1/CustomResourceDefinition
NAME AGE
seldondeployments.machinelearning.seldon.io 1s
==> v1beta1/Deployment
NAME DESIRED CURRENT UP-TO-DATE AVAILABLE AGE
seldon-spartakus-volunteer 1 0 0 0 1s
NOTES:
NOTES: TODO
NAME: seldon-core
LAST DEPLOYED: Wed Mar 13 09:35:42 2019
NAMESPACE: default
STATUS: DEPLOYED
RESOURCES:
==> v1/Role
NAME AGE
seldon-local 0s
==> v1/RoleBinding
NAME AGE
seldon 0s
==> v1/Service
NAME TYPE CLUSTER-IP EXTERNAL-IP PORT(S) AGE
seldon-core-seldon-apiserver NodePort 10.107.198.43 <none> 8080:31692/TCP,5000:30651/TCP 0s
seldon-core-redis ClusterIP 10.104.104.7 <none> 6379/TCP 0s
==> v1beta1/Deployment
NAME DESIRED CURRENT UP-TO-DATE AVAILABLE AGE
seldon-core-seldon-apiserver 1 1 1 0 0s
seldon-core-seldon-cluster-manager 1 1 1 0 0s
seldon-core-redis 1 1 1 0 0s
==> v1/Pod(related)
NAME READY STATUS RESTARTS AGE
seldon-core-seldon-apiserver-7c9898d988-s8rq9 0/1 ContainerCreating 0 0s
seldon-core-seldon-cluster-manager-68ff4ccfcf-jq6l4 0/1 ContainerCreating 0 0s
seldon-core-redis-7d64dc686b-wfmhp 0/1 ContainerCreating 0 0s
==> v1/ServiceAccount
NAME SECRETS AGE
seldon 1 0s
NOTES:
Thank you for installing Seldon Core.
Documentation can be found at https://github.com/SeldonIO/seldon-core
[11]:
!eval $(minikube docker-env) && s2i build . seldonio/seldon-core-s2i-python2:0.5.1 deep-mnist:0.1
---> Installing application source...
---> Installing dependencies ...
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Looking in links: /whl
Requirement already satisfied: tensorflow>=1.12.0 in /usr/local/lib/python2.7/site-packages (from -r requirements.txt (line 1)) (1.12.0)
Requirement already satisfied: astor>=0.6.0 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.7.1)
Requirement already satisfied: keras-preprocessing>=1.0.5 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.0.9)
Requirement already satisfied: gast>=0.2.0 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.2.2)
Requirement already satisfied: enum34>=1.1.6 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.1.6)
Requirement already satisfied: protobuf>=3.6.1 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (3.6.1)
Requirement already satisfied: six>=1.10.0 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.12.0)
Requirement already satisfied: absl-py>=0.1.6 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.7.0)
Requirement already satisfied: backports.weakref>=1.0rc1 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.0.post1)
Requirement already satisfied: wheel in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.33.1)
Requirement already satisfied: termcolor>=1.1.0 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.1.0)
Requirement already satisfied: tensorboard<1.13.0,>=1.12.0 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.12.2)
Requirement already satisfied: numpy>=1.13.3 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.16.1)
Requirement already satisfied: mock>=2.0.0 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (2.0.0)
Requirement already satisfied: keras-applications>=1.0.6 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.0.7)
Requirement already satisfied: grpcio>=1.8.6 in /usr/local/lib/python2.7/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.18.0)
Requirement already satisfied: setuptools in /usr/local/lib/python2.7/site-packages (from protobuf>=3.6.1->tensorflow>=1.12.0->-r requirements.txt (line 1)) (40.8.0)
Requirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python2.7/site-packages (from tensorboard<1.13.0,>=1.12.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (3.0.1)
Requirement already satisfied: futures>=3.1.1; python_version < "3" in /usr/local/lib/python2.7/site-packages (from tensorboard<1.13.0,>=1.12.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (3.2.0)
Requirement already satisfied: werkzeug>=0.11.10 in /usr/local/lib/python2.7/site-packages (from tensorboard<1.13.0,>=1.12.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.14.1)
Requirement already satisfied: funcsigs>=1; python_version < "3.3" in /usr/local/lib/python2.7/site-packages (from mock>=2.0.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.0.2)
Requirement already satisfied: pbr>=0.11 in /usr/local/lib/python2.7/site-packages (from mock>=2.0.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (5.1.2)
Requirement already satisfied: h5py in /usr/local/lib/python2.7/site-packages (from keras-applications>=1.0.6->tensorflow>=1.12.0->-r requirements.txt (line 1)) (2.9.0)
Url '/whl' is ignored. It is either a non-existing path or lacks a specific scheme.
Build completed successfully
[12]:
!kubectl create -f deep_mnist.json
seldondeployment.machinelearning.seldon.io/deep-mnist created
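The deep_mnist.json applied above is a SeldonDeployment custom resource. Its rough shape is sketched below; field values are inferred from the resource, deployment and container names seen elsewhere in this notebook, and the actual file may differ in detail.

{
    "apiVersion": "machinelearning.seldon.io/v1alpha2",
    "kind": "SeldonDeployment",
    "metadata": {"name": "deep-mnist"},
    "spec": {
        "name": "deep-mnist",
        "oauth_key": "oauth-key",
        "oauth_secret": "oauth-secret",
        "predictors": [
            {
                "name": "single-model",
                "replicas": 1,
                "componentSpecs": [
                    {"spec": {"containers": [{"name": "classifier", "image": "deep-mnist:0.1"}]}}
                ],
                "graph": {
                    "name": "classifier",
                    "type": "MODEL",
                    "endpoint": {"type": "REST"},
                    "children": []
                }
            }
        ]
    }
}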
[14]:
!kubectl rollout status deploy/deep-mnist-single-model-8969cc0
Waiting for deployment "deep-mnist-single-model-8969cc0" rollout to finish: 0 of 1 updated replicas are available...
deployment "deep-mnist-single-model-8969cc0" successfully rolled out
[15]:
!seldon-core-api-tester contract.json \
`minikube ip` `kubectl get svc -l app=seldon-apiserver-container-app -o jsonpath='{.items[0].spec.ports[0].nodePort}'` \
--oauth-key oauth-key --oauth-secret oauth-secret -p
RECEIVED RESPONSE:
Success:True message:
Request:
data {
tensor {
shape: 1
shape: 784
values: 0.389
values: 0.778
values: 0.371
values: 0.678
values: 0.462
values: 0.516
values: 0.349
values: 0.639
values: 0.19
values: 0.31
values: 0.53
values: 0.955
values: 0.719
values: 0.031
values: 0.641
values: 0.095
values: 0.444
values: 0.118
values: 0.435
values: 0.573
values: 0.507
values: 0.599
values: 0.266
values: 0.159
values: 0.45
values: 0.64
values: 0.841
values: 0.027
values: 0.408
values: 0.17
values: 0.602
values: 0.511
values: 0.933
values: 0.178
values: 0.176
values: 0.877
values: 0.06
values: 0.368
values: 0.25
values: 0.121
values: 0.178
values: 0.308
values: 0.015
values: 0.686
values: 0.657
values: 0.833
values: 0.076
values: 0.562
values: 0.194
values: 0.327
values: 0.441
values: 0.58
values: 0.972
values: 0.805
values: 0.709
values: 0.26
values: 0.779
values: 0.819
values: 0.194
values: 0.485
values: 0.124
values: 0.874
values: 0.347
values: 0.437
values: 0.241
values: 0.173
values: 0.206
values: 0.588
values: 0.998
values: 0.402
values: 0.458
values: 0.882
values: 0.929
values: 0.75
values: 0.644
values: 0.177
values: 0.261
values: 0.448
values: 0.421
values: 0.845
values: 0.941
values: 0.972
values: 0.253
values: 0.173
values: 0.021
values: 0.625
values: 0.618
values: 0.145
values: 0.168
values: 0.162
values: 0.634
values: 0.115
values: 0.825
values: 0.38
values: 0.945
values: 0.807
values: 0.761
values: 0.014
values: 0.384
values: 0.796
values: 0.382
values: 0.29
values: 0.075
values: 0.265
values: 0.33
values: 0.298
values: 0.692
values: 0.656
values: 0.726
values: 0.711
values: 0.384
values: 0.154
values: 0.501
values: 0.928
values: 0.123
values: 0.677
values: 0.805
values: 0.094
values: 0.598
values: 0.784
values: 0.654
values: 0.97
values: 0.198
values: 0.855
values: 0.015
values: 0.333
values: 0.332
values: 0.395
values: 0.31
values: 0.145
values: 0.53
values: 0.833
values: 0.278
values: 0.836
values: 0.876
values: 0.15
values: 0.425
values: 0.903
values: 0.744
values: 0.973
values: 0.413
values: 0.565
values: 0.249
values: 0.64
values: 0.066
values: 0.138
values: 0.281
values: 0.888
values: 0.238
values: 0.968
values: 0.234
values: 0.922
values: 0.475
values: 0.067
values: 0.535
values: 0.064
values: 0.472
values: 0.345
values: 0.233
values: 0.199
values: 0.425
values: 0.496
values: 0.5
values: 0.291
values: 1.0
values: 0.135
values: 0.015
values: 0.302
values: 0.689
values: 0.184
values: 0.796
values: 0.934
values: 0.924
values: 0.446
values: 0.928
values: 0.528
values: 0.127
values: 0.073
values: 0.29
values: 0.343
values: 0.017
values: 0.953
values: 0.821
values: 0.582
values: 0.133
values: 0.91
values: 0.363
values: 0.532
values: 0.681
values: 0.994
values: 0.197
values: 0.252
values: 0.292
values: 0.25
values: 0.592
values: 0.967
values: 0.303
values: 0.494
values: 0.149
values: 0.285
values: 0.084
values: 0.782
values: 0.829
values: 0.584
values: 0.674
values: 0.066
values: 0.147
values: 0.639
values: 0.611
values: 0.206
values: 0.064
values: 0.722
values: 0.687
values: 0.101
values: 0.08
values: 0.833
values: 0.235
values: 0.864
values: 0.148
values: 0.173
values: 0.48
values: 0.786
values: 0.332
values: 0.42
values: 0.877
values: 0.556
values: 0.363
values: 0.468
values: 0.108
values: 0.591
values: 0.057
values: 0.094
values: 0.022
values: 0.964
values: 0.73
values: 0.908
values: 0.381
values: 0.938
values: 0.053
values: 0.131
values: 0.565
values: 0.706
values: 0.101
values: 0.258
values: 0.519
values: 0.421
values: 0.745
values: 0.956
values: 0.42
values: 0.42
values: 0.386
values: 0.348
values: 0.635
values: 0.857
values: 0.336
values: 0.135
values: 0.965
values: 0.779
values: 0.943
values: 0.072
values: 0.533
values: 0.85
values: 0.132
values: 0.009
values: 0.7
values: 0.71
values: 0.927
values: 0.852
values: 0.813
values: 0.152
values: 0.486
values: 0.26
values: 0.397
values: 0.909
values: 0.719
values: 0.369
values: 0.273
values: 0.362
values: 0.792
values: 0.894
values: 0.922
values: 0.33
values: 0.415
values: 0.181
values: 0.348
values: 0.794
values: 0.585
values: 0.418
values: 0.482
values: 0.264
values: 0.844
values: 0.111
values: 0.575
values: 0.873
values: 0.606
values: 0.767
values: 0.812
values: 0.465
values: 0.375
values: 0.928
values: 0.71
values: 0.228
values: 0.223
values: 0.137
values: 0.301
values: 0.731
values: 0.532
values: 0.351
values: 0.979
values: 0.765
values: 0.295
values: 0.196
values: 0.963
values: 0.206
values: 0.04
values: 0.982
values: 0.249
values: 0.92
values: 0.973
values: 0.478
values: 0.706
values: 0.572
values: 0.371
values: 0.347
values: 0.382
values: 0.142
values: 0.837
values: 0.155
values: 0.533
values: 0.073
values: 0.993
values: 0.311
values: 0.936
values: 0.317
values: 0.175
values: 0.688
values: 0.036
values: 0.645
values: 0.819
values: 0.772
values: 0.803
values: 0.076
values: 0.282
values: 0.28
values: 0.801
values: 0.635
values: 0.606
values: 0.091
values: 0.114
values: 0.51
values: 0.211
values: 0.515
values: 0.512
values: 0.818
values: 0.213
values: 0.71
values: 0.361
values: 0.944
values: 0.41
values: 0.81
values: 0.33
values: 0.026
values: 0.743
values: 0.895
values: 0.539
values: 0.003
values: 0.582
values: 0.793
values: 0.758
values: 0.99
values: 0.85
values: 0.936
values: 0.544
values: 0.331
values: 0.554
values: 0.501
values: 0.537
values: 0.287
values: 0.69
values: 0.906
values: 0.828
values: 0.912
values: 0.019
values: 0.277
values: 0.932
values: 0.586
values: 0.304
values: 0.688
values: 0.661
values: 0.399
values: 0.218
values: 0.518
values: 0.412
values: 0.631
values: 0.152
values: 0.249
values: 0.465
values: 0.449
values: 0.724
values: 0.976
values: 0.531
values: 0.281
values: 0.129
values: 0.013
values: 0.97
values: 0.349
values: 0.442
values: 0.832
values: 0.155
values: 0.908
values: 0.735
values: 0.509
values: 0.439
values: 0.664
values: 0.607
values: 0.727
values: 0.113
values: 0.532
values: 0.601
values: 0.141
values: 0.267
values: 0.651
values: 0.548
values: 0.829
values: 0.327
values: 0.797
values: 0.039
values: 0.371
values: 0.866
values: 0.953
values: 0.332
values: 0.367
values: 0.019
values: 0.98
values: 0.534
values: 0.604
values: 0.921
values: 0.606
values: 0.527
values: 0.367
values: 0.89
values: 0.763
values: 0.742
values: 0.79
values: 0.4
values: 0.575
values: 0.288
values: 0.977
values: 0.523
values: 0.037
values: 0.562
values: 0.218
values: 0.564
values: 0.495
values: 0.44
values: 0.105
values: 0.561
values: 0.623
values: 0.051
values: 0.492
values: 0.726
values: 0.533
values: 0.859
values: 0.806
values: 0.77
values: 0.652
values: 0.572
values: 0.508
values: 0.085
values: 0.396
values: 0.265
values: 0.179
values: 0.15
values: 0.633
values: 0.63
values: 0.482
values: 0.365
values: 0.156
values: 0.416
values: 0.288
values: 0.083
values: 0.273
values: 0.494
values: 0.716
values: 0.135
values: 0.137
values: 0.038
values: 0.736
values: 0.246
values: 0.635
values: 0.276
values: 0.802
values: 0.87
values: 0.853
values: 0.518
values: 0.121
values: 0.8
values: 0.722
values: 0.849
values: 0.327
values: 0.008
values: 0.524
values: 0.912
values: 0.257
values: 0.207
values: 0.342
values: 0.265
values: 0.508
values: 0.118
values: 0.271
values: 0.016
values: 0.823
values: 0.324
values: 0.199
values: 0.951
values: 0.091
values: 0.743
values: 0.554
values: 0.593
values: 0.08
values: 0.113
values: 0.836
values: 0.791
values: 0.853
values: 0.759
values: 0.345
values: 0.456
values: 0.995
values: 0.579
values: 0.354
values: 0.68
values: 0.876
values: 0.107
values: 0.714
values: 0.272
values: 0.766
values: 0.245
values: 0.808
values: 0.042
values: 0.67
values: 0.022
values: 0.324
values: 0.116
values: 0.797
values: 0.268
values: 0.587
values: 0.93
values: 0.242
values: 0.096
values: 0.731
values: 0.191
values: 0.323
values: 0.329
values: 0.969
values: 0.541
values: 0.452
values: 0.603
values: 0.603
values: 0.813
values: 0.615
values: 0.006
values: 0.266
values: 0.315
values: 0.048
values: 0.348
values: 0.796
values: 0.355
values: 0.464
values: 0.343
values: 0.894
values: 0.27
values: 0.7
values: 0.896
values: 0.482
values: 0.134
values: 0.873
values: 0.676
values: 0.212
values: 0.852
values: 0.963
values: 0.725
values: 0.199
values: 0.072
values: 0.822
values: 0.645
values: 0.517
values: 0.461
values: 0.062
values: 0.82
values: 0.7
values: 0.188
values: 0.671
values: 0.692
values: 0.762
values: 0.653
values: 0.259
values: 0.362
values: 0.666
values: 0.782
values: 0.745
values: 0.763
values: 0.184
values: 0.588
values: 0.829
values: 0.52
values: 0.167
values: 0.637
values: 0.899
values: 0.324
values: 0.641
values: 0.447
values: 0.955
values: 0.633
values: 0.796
values: 0.879
values: 0.052
values: 0.028
values: 0.812
values: 0.294
values: 0.781
values: 0.862
values: 0.664
values: 0.939
values: 0.009
values: 0.169
values: 0.87
values: 0.057
values: 0.819
values: 0.159
values: 0.563
values: 0.404
values: 0.236
values: 0.651
values: 0.165
values: 0.127
values: 0.23
values: 0.852
values: 0.715
values: 0.55
values: 0.185
values: 0.014
values: 0.752
values: 0.396
values: 0.232
values: 0.514
values: 0.451
values: 0.852
values: 0.634
values: 0.156
values: 0.26
values: 0.098
values: 0.084
values: 0.512
values: 0.378
values: 0.554
values: 0.304
values: 0.596
values: 0.208
values: 0.491
values: 0.83
values: 0.232
values: 0.279
values: 0.5
values: 0.647
values: 0.627
values: 0.047
values: 0.982
values: 0.244
values: 0.64
values: 0.234
values: 0.159
values: 0.286
values: 0.148
values: 0.529
values: 0.669
values: 0.71
values: 0.188
values: 0.862
values: 0.084
values: 0.486
values: 0.565
values: 0.327
values: 0.271
values: 0.332
values: 0.86
values: 0.036
values: 0.207
values: 0.46
values: 0.96
values: 0.377
values: 0.465
values: 0.984
values: 0.121
values: 0.501
values: 0.289
values: 0.426
values: 0.231
values: 0.488
values: 0.973
values: 0.138
values: 0.436
values: 0.688
values: 0.453
values: 0.826
values: 0.474
values: 0.669
values: 0.968
values: 0.357
values: 0.478
values: 0.199
values: 0.603
values: 0.86
values: 0.54
values: 0.789
values: 0.54
values: 0.232
values: 0.33
values: 0.576
values: 0.007
values: 0.421
values: 0.483
values: 0.898
values: 0.326
values: 0.957
values: 0.586
values: 0.512
values: 0.68
values: 0.885
values: 0.594
values: 0.334
values: 0.958
values: 0.018
values: 0.621
values: 0.153
values: 0.798
values: 0.277
values: 0.938
values: 0.678
values: 0.235
values: 0.168
values: 0.739
values: 0.932
values: 0.358
values: 0.972
values: 0.194
values: 0.634
values: 0.42
values: 0.866
values: 0.336
values: 0.432
values: 0.533
values: 0.67
values: 0.392
values: 0.035
values: 0.309
values: 0.992
values: 0.417
values: 0.385
values: 0.553
values: 0.355
values: 0.868
values: 0.844
values: 0.024
values: 0.953
values: 0.317
values: 0.7
values: 0.835
values: 0.231
values: 0.598
values: 0.441
values: 0.986
values: 0.988
values: 0.829
values: 0.039
values: 0.067
values: 0.092
values: 0.705
values: 0.131
values: 0.154
values: 0.335
values: 0.619
values: 0.214
values: 0.762
values: 0.419
}
}
Response:
meta {
puid: "ivo4qrm59vikikiu5h852lsanv"
requestPath {
key: "classifier"
value: "deep-mnist:0.1"
}
}
data {
names: "class:0"
names: "class:1"
names: "class:2"
names: "class:3"
names: "class:4"
names: "class:5"
names: "class:6"
names: "class:7"
names: "class:8"
names: "class:9"
tensor {
shape: 1
shape: 10
values: 0.000305197638226673
values: 2.2940020016903873e-07
values: 0.17768406867980957
values: 0.3751635253429413
values: 1.6084664821391925e-05
values: 0.42158570885658264
values: 0.0020625267643481493
values: 0.0028135962784290314
values: 0.019332798197865486
values: 0.001036234200000763
}
}
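Instead of seldon-core-api-tester you can also hit the API gateway directly. The sketch below (Python, using requests) assumes the OAuth token and prediction endpoints exposed by the seldon-core API server of this version and the NodePort shown in the helm output above; adjust the IP and port for your cluster.

# Hypothetical sketch: call the Seldon API gateway without the tester.
import requests

API = "http://192.168.99.100:31692"  # minikube ip + apiserver NodePort

# 1. Exchange the deployment's OAuth key/secret for an access token
token = requests.post(
    API + "/oauth/token",
    auth=("oauth-key", "oauth-secret"),
    data={"grant_type": "client_credentials"},
).json()["access_token"]

# 2. POST a (1, 784) payload to the prediction endpoint
payload = {"data": {"ndarray": [[0.5] * 784]}}
resp = requests.post(
    API + "/api/v0.1/predictions",
    headers={"Authorization": "Bearer " + token},
    json=payload,
)
print(resp.json())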
[16]:
!minikube delete
Deleting "minikube" from virtualbox ...
The "minikube" cluster has been deleted.
[ ]: