
Wrong calculation of cross-entropy loss, or a typo?

Hi guys, could you explain how we got 1.8220 using the formula -p1*log(p1) - p2*log(p2) - p3*log(p3) and the hint value -log(0.7) - log(0.7) - log(0.33)?

If I evaluate it at the Python prompt or on a calculator, I get 0.79128998009.


Course: https://www.educative.io/collection/5184083498893312/5582183480688640
Lesson: https://www.educative.io/collection/page/5184083498893312/5582183480688640/4556318946361344

Hey @fesswood, the hint value -log(0.7) - log(0.7) - log(0.33) is correct; the lesson uses the natural logarithm (base e). Evaluating the same expression with base-10 logs is what gives your 0.79128998009. We can evaluate it in Python as follows:

import math
print(-math.log(0.7) - math.log(0.7) - math.log(0.33))  # ≈ 1.8220

We can also compute this expression on a calculator as follows:

-ln(0.7) - ln(0.7) - ln(0.33) = -ln(0.7 × 0.7 × 0.33) = -ln(0.1617) ≈ 1.8220
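
To make the log-base issue explicit, here is a small sketch in plain Python (standard library only) comparing the natural-log and base-10 evaluations; the base-10 result matches the 0.79128998009 from your calculation:

import math

probs = [0.7, 0.7, 0.33]  # probabilities of the correct class used in the hint

natural_log_loss = -sum(math.log(p) for p in probs)     # natural log (base e)
base10_log_loss = -sum(math.log(p, 10) for p in probs)  # base-10 log

print(natural_log_loss)  # ≈ 1.8220, the value in the lesson
print(base10_log_loss)   # ≈ 0.7913, what a base-10 log gives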

Happy learning at Educative.


Hi @Ali_Sultan, I think the categorical cross-entropy calculation is not fully clear here:

  1. Usually we use log with base 2 to calculate the loss; if we do not set the base for math.log in Python, its default is e (the natural log).
  2. The final loss should be divided by the number of samples. The average value, not the sum, is what we use for back-propagation (see the sketch after this list).

Thanks.
Reference: Cross-entropy - Wikipedia
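
To make point 2 concrete, here is a minimal sketch in plain Python with made-up one-hot labels and predicted probabilities (not the lesson's actual data), showing the per-sample cross-entropy, the summed loss, and the averaged loss:

import math

# hypothetical one-hot true labels and predicted probabilities for 3 samples
y_true = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
y_pred = [[0.70, 0.20, 0.10],
          [0.20, 0.70, 0.10],
          [0.34, 0.33, 0.33]]

# per-sample categorical cross-entropy: -sum_i y_i * log(p_i), natural log
losses = [-sum(t * math.log(p) for t, p in zip(t_row, p_row))
          for t_row, p_row in zip(y_true, y_pred)]

total_loss = sum(losses)              # ≈ 1.8220 (the sum shown in the lesson)
mean_loss = total_loss / len(losses)  # ≈ 0.6073 (the average used for back-prop)
print(total_loss, mean_loss)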

Course: Machine Learning System Design - Learn Interactively
Lesson: Training Pipeline - Machine Learning System Design