
Loss computation gradient descent

```python
# computing the predictions for all 80 data points in the training set
print(x.shape, bs.shape)
all_predictions = np.apply_along_axis(
    func1d=lambda x: bs + ws * x,
    axis=1,
    arr=x_train,
)
print(all_predictions.shape)
```
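For reference, the same predictions can be computed with plain NumPy broadcasting instead of `np.apply_along_axis`. The sketch below assumes `bs` and `ws` are 2-D grids of candidate parameters (as used when plotting a loss surface) and `x_train` has shape (80, 1); the grid size of 101×101 is illustrative, not necessarily the lesson's:

```python
import numpy as np

# Illustrative shapes (assumptions): grids of candidate parameters
# and 80 training samples
bs = np.random.rand(101, 101)
ws = np.random.rand(101, 101)
x_train = np.random.rand(80, 1)

# apply_along_axis version: the lambda is called once per training sample,
# producing one (101, 101) prediction grid per sample
looped = np.apply_along_axis(
    func1d=lambda x: bs + ws * x,
    axis=1,
    arr=x_train,
)

# Equivalent vectorized version: reshape x_train to (80, 1, 1) so it
# broadcasts against the (101, 101) parameter grids in one step
vectorized = bs + ws * x_train.reshape(-1, 1, 1)

print(looped.shape, vectorized.shape)   # (80, 101, 101) (80, 101, 101)
print(np.allclose(looped, vectorized))  # True
```

The vectorized form avoids the per-sample Python-level calls that `apply_along_axis` makes, so it is usually faster for this kind of computation.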

I don’t understand what `x` is, or why its shape is (100, 1) when the training set has only 80 samples.

Hi @Vishal_Ahuja1,

As mentioned in the Data Generation lesson of Deep Learning with PyTorch Step-by-Step: Part I - Fundamentals, 100 data points (`x`) are generated randomly, and 80 of them are then selected as the training set (`x_train`). That is why `x` has shape (100, 1) even though the training set has only 80 samples.
The lambda function above computes predictions for the 80 training samples only (`arr=x_train`).
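To make the split concrete, here is a minimal sketch of that data-generation step. The true parameter values (`true_b`, `true_w`), the noise scale, and the seed are illustrative assumptions, not necessarily the lesson's exact values:

```python
import numpy as np

np.random.seed(42)

# Generate 100 random data points (x) and labels y = b + w*x + noise.
# true_b, true_w, and the noise scale are illustrative, not the lesson's.
true_b, true_w = 1, 2
x = np.random.rand(100, 1)
y = true_b + true_w * x + 0.1 * np.random.randn(100, 1)

# Shuffle the indices and keep the first 80 points as the training set
idx = np.random.permutation(100)
x_train, y_train = x[idx[:80]], y[idx[:80]]

print(x.shape)        # (100, 1) -- all generated points
print(x_train.shape)  # (80, 1)  -- the training split
```

So `x` holds all 100 generated points, while `x_train` holds the 80-sample subset the predictions are computed on.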