Cloud TPU Pods: AI Supercomputing for Large Machine Learning Problems (Google I/O'19)

Cloud Tensor Processing Units (TPUs) are ASICs designed by Google for neural network processing. TPUs feature a domain-specific architecture built to accelerate TensorFlow training and prediction workloads, and they provide performance benefits for machine learning in production. Learn the technical details of Cloud TPUs and Cloud TPU Pods, and the new TensorFlow features that enable large-scale model parallelism for deep learning training.

Watch more #io19 here:
Machine Learning at Google I/O 2019 Playlist → https://goo.gle/2URpjol
TensorFlow at Google I/O 2019 Playlist → http://bit.ly/2GW7ZJM
Google I/O 2019 All Sessions Playlist → https://goo.gle/io19allsessions
Learn more on the I/O Website → https://google.com/io

Subscribe to the TensorFlow Channel → https://bit.ly/TensorFlow1
Get started at → https://www.tensorflow.org/

Speaker(s): Kaz Sato and Martin Gorner
