BERT is a language model introduced by Google in 2018 that has become widely used in Natural Language Processing (NLP) research. It comes in two sizes, BERT-Base and BERT-Large, which differ in the number of encoder layers and self-attention heads. Both models were pre-trained on large amounts of English text data.
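As a minimal sketch of the two sizes, the snippet below loads each model's configuration with the Hugging Face `transformers` library (an assumption: the text names no specific toolkit) and prints the architecture parameters. The checkpoint names `bert-base-uncased` and `bert-large-uncased` are the standard pre-trained English checkpoints.

```python
# Sketch: compare the two pre-trained BERT sizes via their configs.
# Assumes the Hugging Face `transformers` package is installed; only the
# small config files are downloaded, not the full model weights.
from transformers import BertConfig

for checkpoint in ("bert-base-uncased", "bert-large-uncased"):
    config = BertConfig.from_pretrained(checkpoint)
    # BERT-Base:  12 encoder layers, 12 attention heads, hidden size 768
    # BERT-Large: 24 encoder layers, 16 attention heads, hidden size 1024
    print(
        f"{checkpoint}: {config.num_hidden_layers} layers, "
        f"{config.num_attention_heads} heads, hidden size {config.hidden_size}"
    )
```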
University of Washington
Autumn 2019
A survey course on neural network implementation and applications, including image processing, classification, detection, and segmentation. The course also covers semantic understanding, translation, and question-answering applications. It is best suited to students with a background in machine learning, neural networks, optimization, and CNNs.