https://www.educative.io/courses/beginners-guide-to-deep-learning/B8gXqogEmNW


Could you check the backpropagation code highlighted in the screenshot, please? The question statement says the following:
:memo: 1. Apply the sigmoid activation function to the net hidden-layer outputs, respectively.
:memo: 2. Apply the softmax activation function to the net output of the final layer.
In this snippet, while calculating dW3 and db3, the slope, i.e., the derivative of the softmax, is not taken into account. Similarly, while calculating dh2 (the dot product at hidden layer 2), only l3_error is used; the slope, i.e., the derivative of the softmax function, is missing there as well.

Could you check the code for such issues all the way back to the first layer, please? @Javeria_Tariq, please assist. Thanks.
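(For reference, the "slope" of the softmax referred to above is its Jacobian; this is the standard result, sketched here in generic notation rather than the course's own:)

$$
s_i = \frac{e^{z_i}}{\sum_k e^{z_k}}, \qquad
\frac{\partial s_i}{\partial z_j} = s_i\,(\delta_{ij} - s_j)
$$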

Hi Vikrant,

Thank you for reaching out. In response to your query, we have made some updates to the content of the challenge. We have mentioned the loss function, which was previously missing, and added a detailed hint in the code widget for computing the gradient of the cross-entropy loss function w.r.t. the weights of the third layer. In the hint, you can see how the derivative of the softmax is utilised. The solution is correct with respect to the cross-entropy loss function. If you have further questions, please do let us know.
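To make the resolution concrete: when softmax is paired with a cross-entropy loss, the softmax Jacobian and the loss derivative cancel, so the gradient at the final-layer logits reduces to `y_hat - y`, which is what `l3_error` already holds. Below is a minimal NumPy sketch of that simplification, assuming one-hot labels; apart from `dW3`, `db3`, `dh2`, and `l3_error` (names taken from this thread), all variable names and shapes are illustrative, not the course's actual code.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
h2 = rng.normal(size=(4, 5))     # hidden-layer-2 activations (batch of 4, illustrative)
W3 = rng.normal(size=(5, 3))     # third-layer weights (illustrative shapes)
b3 = np.zeros(3)
y = np.eye(3)[[0, 2, 1, 0]]      # one-hot labels for the batch

z3 = h2 @ W3 + b3                # net output of the final layer (logits)
y_hat = softmax(z3)

# For cross-entropy loss L = -sum(y * log(y_hat)), the softmax Jacobian
# and the loss derivative combine into a single term: dL/dz3 = y_hat - y.
l3_error = y_hat - y             # softmax slope is already folded in here

dW3 = h2.T @ l3_error            # no separate softmax derivative needed
db3 = l3_error.sum(axis=0)
dh2 = l3_error @ W3.T            # gradient flowing back into hidden layer 2
```

This is why `dW3 = h2.T @ l3_error` is complete as written: multiplying by an explicit softmax Jacobian on top of `l3_error` would double-count the derivative.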

Hi @Sami_Muzzamil, I've tagged you in my other post on backpropagation: https://www.educative.io/courses/beginners-guide-to-deep-learning/q2615KlEy67. Could you post your inputs there, please?

