educative.io

https://www.educative.io/courses/beginners-guide-to-deep-learning/N8MoOyLP9WN

Could you correct the statement below or provide an explanation?
The feedforward operation is a multiply and add process which is a dot product in vector algebra.
Visualize this in the illustration below:
The third slide (picture) states that:
Hidden layer weights is a vector of size: ((hidden layer units * input units) + bias units) * input units

There are 2 input units and 2 hidden units, so shouldn't the weights have a vector size of 2 * 2?
Why is it multiplied by the input units again in the statement above? Shouldn't it be
(hidden layer units * input units) + bias units? Otherwise, please provide an explanation.

Below is my analysis
The size of the hidden layer weights matrix would be 4.
The formula for the size of the hidden layer weights matrix is:
(hidden layer units * input units)
In this case, there are 2 input units and 2 hidden units, so the size of the hidden layer weights matrix would be:
(2 * 2) = 4

Any input on this?

@Javeria_Tariq could you correct this or provide an explanation, please?

Hi @Vikrant !!
For a fully connected neural network (where each neuron in a layer is connected to every neuron in the previous layer), the size of the hidden layer weights can be calculated as follows:

Let:

  • hidden_layer_units be the number of neurons in the hidden layer.
  • input_units be the number of neurons in the previous (input) layer.
  • bias_units be the number of bias terms (usually one per neuron in the hidden layer).

Then, the size of the hidden layer weights (assuming each neuron in the hidden layer has its own bias term) would be:

hidden_layer_weights_size = (hidden_layer_units * input_units) + bias_units

This formula takes into account the weights connecting each input unit to each hidden unit and the bias terms associated with each hidden unit.

For example, if you have a neural network with 100 hidden units and the input layer has 50 units, and each hidden unit has its own bias term, the size of the hidden layer weights would be:

hidden_layer_weights_size = (100 * 50) + 100 = 5100
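The count above can be verified with a short sketch. The function name `layer_param_count` is my own for illustration; it simply sums the sizes of a weight matrix of shape (hidden_units, input_units) and a bias vector with one entry per hidden unit:

```python
# Sketch: counting trainable parameters in one fully connected layer.
# Follows the formula (hidden_layer_units * input_units) + bias_units,
# assuming one bias term per hidden neuron.
import numpy as np

def layer_param_count(input_units, hidden_units):
    W = np.zeros((hidden_units, input_units))  # weight matrix
    b = np.zeros(hidden_units)                 # one bias per hidden neuron
    return W.size + b.size

print(layer_param_count(2, 2))     # (2 * 2) + 2 = 6
print(layer_param_count(50, 100))  # (100 * 50) + 100 = 5100
```

For the 2-input, 2-hidden-unit case from the question, this gives (2 * 2) + 2 = 6 parameters in total: 4 weights plus 2 biases.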
Thanks for pointing it out. We will update this soon.


It has been updated.
Happy Learning :blush:
