Week 11 – Lecture: PyTorch activation and loss functions

0:00:00 – Week 11 – Lecture

LECTURE Part A: http://bit.ly/pDL-en-11-1
In this section, we discussed the common activation functions in PyTorch. In particular, we compared activations with kink(s) against smooth activations - the former are preferred in deep neural networks, since the latter may suffer from the vanishing gradient problem. We then learned about the common loss functions in PyTorch; a short sketch of both topics follows the timestamps below.
0:00:15 – Activation Functions
0:14:21 – Q&A of activation
0:33:10 – Loss Functions (until AdaptiveLogSoftmax)
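
Here is a minimal sketch of these ideas, assuming nothing beyond stock PyTorch: it probes the gradients of a kinked activation (ReLU) against smooth, saturating ones (Tanh, Sigmoid), then evaluates two of the common losses. All tensors are toy random data for illustration.

```python
import torch
import torch.nn as nn

# Kinked vs. smooth activations: tanh and sigmoid saturate, so their
# gradients shrink toward zero for large |x| (the vanishing gradient
# problem); ReLU keeps a gradient of 1 everywhere on its positive side.
x = torch.linspace(-5.0, 5.0, steps=11, requires_grad=True)
for act in (nn.ReLU(), nn.LeakyReLU(0.1), nn.Tanh(), nn.Sigmoid()):
    (grad,) = torch.autograd.grad(act(x).sum(), x)
    print(f"{type(act).__name__:>9}: {grad}")

# Two of the common loss functions from this part of the lecture,
# on toy data: cross-entropy for classification, MSE for regression.
logits = torch.randn(4, 5)              # batch of 4, 5 classes
target = torch.randint(0, 5, (4,))
print(nn.CrossEntropyLoss()(logits, target))
print(nn.MSELoss()(torch.randn(4), torch.randn(4)))
```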

LECTURE Part B: http://bit.ly/pDL-en-11-2
In this section, we continued to learn about loss functions - in particular, margin-based losses and their applications. We then discussed how to design a good loss function for EBMs, along with examples of well-known EBM loss functions. We paid particular attention to margin-based losses, explaining the idea of the "most offending incorrect answer"; a sketch of these losses follows the timestamps below.
0:53:27 – Loss Functions (until CosineEmbeddingLoss)
1:08:23 – Loss Functions and Loss Functions for Energy Based Models
1:23:18 – Loss Functions for Energy Based Models
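
Below is a minimal sketch of the margin-based losses from this part, plus a hinge-style EBM loss built around the "most offending incorrect answer". The `ebm_hinge` helper and all tensors are hypothetical illustrations, not part of PyTorch; only `MarginRankingLoss` and `CosineEmbeddingLoss` are stock modules.

```python
import torch
import torch.nn as nn

x1, x2 = torch.randn(8, 16), torch.randn(8, 16)
y = torch.randint(0, 2, (8,)).float() * 2 - 1   # labels in {-1, +1}

# MarginRankingLoss: wants score(x1) > score(x2) by the margin when y = +1.
ranking = nn.MarginRankingLoss(margin=1.0)
print(ranking(x1.norm(dim=1), x2.norm(dim=1), y))

# CosineEmbeddingLoss: pulls pairs together (y = +1) or apart (y = -1).
cosine = nn.CosineEmbeddingLoss(margin=0.5)
print(cosine(x1, x2, y))

# Hinge loss for an energy-based model (hypothetical helper): push the
# energy of the correct answer below the energy of the most offending
# incorrect answer (the lowest-energy wrong answer) by at least `margin`.
def ebm_hinge(energies, target, margin=1.0):
    correct = energies.gather(1, target.unsqueeze(1)).squeeze(1)
    masked = energies.scatter(1, target.unsqueeze(1), float("inf"))
    most_offending = masked.min(dim=1).values
    return torch.clamp(correct - most_offending + margin, min=0).mean()

energies = torch.randn(8, 10)           # (batch, n_answers), lower = better
target = torch.randint(0, 10, (8,))
print(ebm_hinge(energies, target))
```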

#pytorch #python #data-science #machine-learning #artificial-intelligence
