
Having a problem understanding the use of reshape

Hello.
Thank you for offering this great course.
I am really stuck on the use of reshape in the solution for implementing Batch Normalization manually.

I have a problem understanding the solution offered here.

  • Why are the shapes of Gamma and Beta the same as the channel dimension?
  • Why did we reshape the mean like this:

variance = torch.mean((X - mean.reshape((1, C, 1, 1))) ** 2, dim=(0, 2, 3))

Thank you

Hello Mohamed_Sami,

  • Why are the shapes of Gamma and Beta the same as the channel dimension?
    Gamma and Beta are the trainable parameters of the affine transformation that batch normalization applies after normalizing the input: Gamma re-scales (γ) the normalized values and Beta shifts (β) them. Because batch normalization for convolutional inputs normalizes each channel separately (the statistics are computed over the batch, height, and width dimensions), each channel needs its own scale and shift. That is why Gamma and Beta have the shape of the channel dimension, i.e. (C,); see the sketch after this list.

  • Why did we reshape the mean like this:

variance = torch.mean((X - mean.reshape((1, C, 1, 1))) ** 2, dim=(0, 2, 3))

The mean computed with dim=(0, 2, 3) has shape (C,), one value per channel, while the input X has the 4D shape (N, C, H, W). Reshaping the mean to (1, C, 1, 1) lets it broadcast against X along the channel dimension, so each channel's mean is subtracted from that channel's values when computing the variance, as shown in the sketch below.
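
Here is a minimal sketch of the full manual computation, assuming an input of shape (N, C, H, W); the names X, gamma, beta, eps and the sizes 2, 3, 4, 4 are illustrative and may not match the course's exact solution:

import torch

N, C, H, W = 2, 3, 4, 4
X = torch.randn(N, C, H, W)

# One trainable scale (γ) and shift (β) per channel, hence shape (C,).
gamma = torch.ones(C, requires_grad=True)
beta = torch.zeros(C, requires_grad=True)
eps = 1e-5

# Per-channel statistics: reduce over batch, height, and width.
mean = torch.mean(X, dim=(0, 2, 3))                                          # shape (C,)
variance = torch.mean((X - mean.reshape((1, C, 1, 1))) ** 2, dim=(0, 2, 3))  # shape (C,)

# Reshape the (C,)-shaped tensors to (1, C, 1, 1) so they broadcast
# against the (N, C, H, W) input along the channel dimension.
X_hat = (X - mean.reshape((1, C, 1, 1))) / torch.sqrt(variance.reshape((1, C, 1, 1)) + eps)
Y = gamma.reshape((1, C, 1, 1)) * X_hat + beta.reshape((1, C, 1, 1))

# Sanity check against PyTorch's built-in batch norm in training mode.
Y_ref = torch.nn.functional.batch_norm(
    X, running_mean=None, running_var=None,
    weight=gamma, bias=beta, training=True, eps=eps)
print(torch.allclose(Y, Y_ref, atol=1e-5))  # True

The same reshape to (1, C, 1, 1) is applied to mean, variance, gamma, and beta for the same reason: they all hold one value per channel and need to line up with the channel axis of the 4D input.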

I hope this answers your query; please let me know if anything is still unclear.

Thank You :blush:

Thank you @Abdul_Mateen for your explanation.

I fully understand it now.

Many regards :blush:
