Tensor Processing Units (TPUs) are application-specific AI accelerators developed by Google for neural-network machine learning. First used internally in 2015, they are now available to third parties through Google's cloud infrastructure, and a smaller edge version of the chip is sold directly. TPUs are programmed primarily through Google's own TensorFlow software.
Stanford University
Fall 2022
This course focuses on the principles and trade-offs involved in designing modern parallel computing systems, and it also teaches parallel programming techniques. It is intended for students who want to understand both parallel hardware and parallel software design. Prerequisite knowledge of computer systems is required.
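As a flavor of the parallel programming techniques such a course covers, here is a minimal Python sketch (not course material; the function names are illustrative) of a data-parallel decomposition: a large reduction is split into independent chunks, each chunk is summed by a separate worker process, and the partial results are combined.

```python
from concurrent.futures import ProcessPoolExecutor

def partial_sum_squares(chunk):
    # Each worker reduces its own slice independently (data parallelism).
    return sum(x * x for x in chunk)

def parallel_sum_squares(data, workers=4):
    # Split the input into one strided chunk per worker, then combine
    # the partial reductions from the worker processes.
    chunks = [data[i::workers] for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum_squares, chunks))

if __name__ == "__main__":
    print(parallel_sum_squares(list(range(1000))))  # same result as a serial sum
```

The key design trade-off this illustrates is the one such courses emphasize: decomposing work to expose parallelism while keeping communication (here, collecting partial sums) cheap relative to computation.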