How to customize TensorFlow Serving

TensorFlow Serving supports many additional features that you can leverage. Wei Wei, Developer Advocate at Google, discusses how you can customize TensorFlow Serving for your needs. He covers its integration with monitoring tools, support for custom ops, how to configure the TF Serving model server, and more. Learn how TF Serving supports basic A/B testing and integrates seamlessly with Docker and Kubernetes to scale with demand.
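As a taste of the model server configuration covered in the video, the sketch below shows a model config file that serves two versions of one model and assigns them version labels for a basic A/B setup (the model name, base path, and version numbers are hypothetical; the file is passed to the server with the --model_config_file flag):

```proto
model_config_list {
  config {
    # "my_model" and its base path are placeholder values
    name: "my_model"
    base_path: "/models/my_model"
    model_platform: "tensorflow"
    # Serve two specific versions side by side
    model_version_policy {
      specific {
        versions: 1
        versions: 2
      }
    }
    # Labels let clients target a version without hardcoding its number
    version_labels {
      key: "stable"
      value: 1
    }
    version_labels {
      key: "canary"
      value: 2
    }
  }
}
```

With labels in place, a REST client can route a fraction of traffic to the canary by requesting /v1/models/my_model/labels/canary:predict instead of the stable label.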

TensorFlow Serving configuration →
Prometheus →
Monitoring configuration →
Use TensorFlow Serving with Kubernetes →
Serving TensorFlow models with custom ops →
TensorFlow Serving model server flags →
TensorFlow Serving metrics documentation →
Docker Compose documentation →
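The Prometheus integration linked above is enabled through a small monitoring config; a minimal sketch (passed to the server via the --monitoring_config_file flag, with the REST endpoint enabled via --rest_api_port) looks like:

```proto
prometheus_config {
  # Expose TF Serving metrics in Prometheus format
  enable: true
  # Path where Prometheus will scrape metrics from the REST endpoint
  path: "/monitoring/prometheus/metrics"
}
```

Prometheus is then pointed at this path on the model server's REST port to scrape serving metrics such as request counts and latencies.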

Deploying Production ML Models with TensorFlow Serving playlist →
Subscribe to TensorFlow →

#TensorFlow #MachineLearning #ML
