Residual Neural Networks (ResNets) are deep learning models with skip connections that perform identity mappings, adding a layer's input directly to its output. This allows very deep networks to train easily and achieve better accuracy. ResNets were developed by Kaiming He et al. and won the ImageNet (ILSVRC) 2015 classification competition.
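A minimal sketch of this idea, assuming PyTorch and a hypothetical ResidualBlock class (not the authors' original implementation): the block computes a stack of convolutions F(x) and adds the unchanged input x back in before the final activation.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Basic residual block: output = ReLU(F(x) + x)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        identity = x                                  # skip connection: keep the input as-is
        out = self.relu(self.bn1(self.conv1(x)))      # first conv + batch norm + ReLU
        out = self.bn2(self.conv2(out))               # second conv + batch norm
        out = out + identity                          # add the identity mapping to the layer output
        return self.relu(out)

# Usage: pass a feature map through the block; the shapes of input and output match.
block = ResidualBlock(64)
y = block(torch.randn(1, 64, 32, 32))
```

Because the skip path is an identity, gradients can flow directly through the addition, which is what makes stacking many such blocks trainable.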
Stanford University
Fall 2022
An in-depth course focused on building neural networks and leading successful machine learning projects. It covers Convolutional Networks, RNNs, LSTMs, Adam, Dropout, BatchNorm, Xavier/He initialization, and more. Students are expected to have basic computer science skills, knowledge of probability theory, and familiarity with linear algebra.