How to customize TensorFlow Serving

TensorFlow Serving supports many additional features that you can leverage. Wei Wei, Developer Advocate at Google, discusses how you can customize TensorFlow Serving for your needs. He covers its integration with monitoring tools, support for custom ops, how to configure the TF Serving model server, and more. Learn how TF Serving supports basic A/B testing and integrates seamlessly with Docker and Kubernetes to scale with demand.
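As a taste of the monitoring integration covered in the video: TF Serving can expose metrics to Prometheus when the model server is started with a monitoring configuration file. A minimal sketch (the file paths and model name here are placeholders, not from the video):

```
# monitoring.config — enables the Prometheus metrics endpoint
prometheus_config {
  enable: true
  path: "/monitoring/prometheus/metrics"
}
```

The server is then pointed at this file via the --monitoring_config_file flag, e.g. `tensorflow_model_server --rest_api_port=8501 --model_name=my_model --monitoring_config_file=/path/to/monitoring.config`, after which Prometheus can scrape metrics from the configured path. See the monitoring configuration link below for details.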

Resources:
TensorFlow Serving configuration → https://goo.gle/3QxxvIF
Prometheus → https://goo.gle/3N8XCmi
Monitoring configuration → https://goo.gle/3N8XCmi
Use TensorFlow Serving with Kubernetes → https://goo.gle/3zYvX4B
Serving TensorFlow models with custom ops → https://goo.gle/3y7JvsY
TensorFlow Serving model server flags → https://goo.gle/3tQ99Qq
TensorFlow Serving metrics documentation → https://goo.gle/3NeAjY6
Docker Compose documentation → https://goo.gle/3xRuDxw

Deploying Production ML Models with TensorFlow Serving playlist → https://goo.gle/tf-serving
Subscribe to TensorFlow → https://goo.gle/TensorFlow

#TensorFlow #MachineLearning #ML
