Update January 2018
- Seldon Core open sourced.
- Seldon Core focuses purely on deploying a wide range of ML models on Kubernetes, allowing complex runtime serving graphs to be managed in production. Seldon Core continues the goals of the Seldon-Server project but narrows its focus to the final step of a machine learning project: serving models in production. Please have a look at the project page, which includes extensive documentation, to investigate further.
Seldon Server is a machine learning platform that helps your data science team train and deploy models into production.
It provides an open-source data science stack that runs within a Kubernetes cluster. You can use Seldon to deploy machine learning and deep learning models into production on-premise or in the cloud (e.g. GCP, AWS, Azure).
Seldon supports models built with TensorFlow, Keras, Vowpal Wabbit, XGBoost, Gensim and any other model-building tool — it even supports models built with commercial tools and services where the model is exportable.
It includes an API with two key endpoints:
- Predict - Build and deploy supervised machine learning models created in any machine learning library or framework at scale using containers and microservices.
- Recommend - High-performance recommendation engine based on user activity and content, with various algorithms ready to run out of the box.
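As a rough illustration of how a client might talk to the Predict endpoint, the sketch below builds a JSON request body wrapping a feature dictionary. The endpoint path and payload shape here are assumptions for illustration only; the actual request contract is defined in the Seldon Server API documentation.

```python
import json

# Assumed path and payload shape, NOT the documented Seldon contract;
# shown only to illustrate the kind of JSON a Predict call might carry.
PREDICT_PATH = "/predict"  # hypothetical endpoint path


def build_predict_payload(features):
    """Wrap a feature dictionary in a minimal JSON request body."""
    return json.dumps({"data": features})


payload = build_predict_payload({"age": 34, "country": "UK"})
```

Because the model runs behind a microservice, the same payload works regardless of which library (TensorFlow, XGBoost, etc.) produced the deployed model.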
Other features include:
- Complex dynamic algorithm configuration and combination with no downtime: run A/B and Multivariate tests, cascade algorithms and create ensembles.
- Command Line Interface (CLI) for configuring and managing Seldon Server.
- Secure OAuth 2.0 REST and gRPC APIs to streamline integration with your data and application.
- Grafana dashboard for real-time analytics built with Kafka Streams, Fluentd and InfluxDB.
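To give a feel for the OAuth 2.0 flow mentioned above, here is a minimal client-side sketch: obtain a token with client credentials, then attach it as a bearer token to a subsequent REST call. The host, token path, and parameter names are placeholders, not Seldon's documented endpoints; see the API documentation for the real contract.

```python
import urllib.parse
import urllib.request

BASE_URL = "http://seldon.example.com"  # placeholder host, not a real endpoint


def build_token_request(key, secret):
    """Build a token request using the OAuth 2.0 client-credentials grant."""
    params = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": key,
        "client_secret": secret,
    })
    # POST because a request body is supplied
    return urllib.request.Request(BASE_URL + "/token", data=params.encode())


def build_api_request(path, token):
    """Attach the bearer token to a subsequent API call."""
    req = urllib.request.Request(BASE_URL + path)
    req.add_header("Authorization", "Bearer " + token)
    return req


req = build_api_request("/predict", "example-token")
```

The same token would be used for both the Predict and Recommend endpoints; gRPC clients pass credentials through call metadata instead of an HTTP header.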
Seldon is used by some of the world’s most innovative organisations — it’s the perfect machine learning deployment platform for start-ups and can scale to meet the demands of large enterprises.
It takes a few minutes to install Seldon on a Kubernetes cluster. Visit our install guide.
Community & Support
- Join the Seldon Users Group.
- Register for our newsletter to be the first to receive updates about our products and events.
- Visit our website, follow @seldon_io on Twitter and like our Facebook page.
- If you’re in London, meet us at TensorFlow London - a community of over 1200 data scientists that we co-organise.
- We also offer commercial support plans and managed services.
Seldon is available under the Apache License, Version 2.0.