Initialization with Same Weights
Objective for this Notebook
1. Learn how to Define the Neural Network with Same Weights Initialization, define the Criterion Function and Optimizer, and Train the Model
2. Define the Neural Network with Default Weights Initialization, define the Criterion Function and Optimizer
3. Train the Model
Table of Contents
In this lab, we will see the problem of initializing all the weights of a network with the same value. We will see that, even for a simple network, the model will not train properly.
- Neural Network Module and Training Function
- Make Some Data
- Define the Neural Network with Same Weights Initialization, define the Criterion Function and Optimizer, and Train the Model
- Define the Neural Network with Default Weights Initialization, define the Criterion Function and Optimizer, and Train the Model
Estimated Time Needed: 25 min
Preparation
We'll need the following libraries:
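A minimal import cell, assuming only PyTorch and Matplotlib are needed for this lab:

import torch
import torch.nn as nn
import matplotlib.pyplot as plt

# Fix the random seed for reproducibility
torch.manual_seed(0)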
The following helper function is used for plotting the model:
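The helper below is a sketch of what PlotStuff might look like; the original lab defines its own version, so treat the exact plotting details as assumptions.

# A sketch of the PlotStuff helper: plot the model's prediction
# against the data at a given epoch
def PlotStuff(X, Y, model, epoch, leg=True):
    plt.plot(X.numpy(), model(X).detach().numpy(), label='epoch ' + str(epoch))
    plt.plot(X.numpy(), Y.numpy(), 'r', label='data')
    plt.xlabel('x')
    if leg:
        plt.legend()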
Neural Network Module and Training Function
Define the activations and the outputs of the linear layers as attributes, so that we can plot them later. Note that this is not good practice.
# Define the class Net
class Net(nn.Module):

    # Constructor
    def __init__(self, D_in, H, D_out):
        super(Net, self).__init__()
        # Hidden layer and output layer
        self.linear1 = nn.Linear(D_in, H)
        self.linear2 = nn.Linear(H, D_out)
        # Store the intermediate values as attributes so we can plot
        # them later; this is not good practice
        self.a1 = None
        self.l1 = None
        self.l2 = None

    # Prediction
    def forward(self, x):
        self.l1 = self.linear1(x)
        self.a1 = torch.sigmoid(self.l1)
        self.l2 = self.linear2(self.a1)
        yhat = torch.sigmoid(self.l2)
        return yhat
Define the training function:
# Define the training function
def train(Y, X, model, optimizer, criterion, epochs=1000):
    cost = []
    for epoch in range(epochs):
        total = 0
        for y, x in zip(Y, X):
            yhat = model(x)
            loss = criterion(yhat, y)
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()
            # Cumulative loss over the epoch
            total += loss.item()
        cost.append(total)
        if epoch % 300 == 0:
            PlotStuff(X, Y, model, epoch, leg=True)
            plt.show()
            # Run a forward pass on all of X so that model.a1 holds the
            # hidden-layer activations for every sample
            model(X)
            plt.scatter(model.a1.detach().numpy()[:, 0], model.a1.detach().numpy()[:, 1], c=Y.numpy().reshape(-1))
            plt.title('activations')
            plt.show()
    return cost
Make Some Data
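A sketch of the data cell, assuming the one-dimensional classification problem the activation plots suggest: the target is 1 on an interval around zero and 0 elsewhere.

# Make some data: X is a column of values from -20 to 20, and the
# target Y is 1 on the interval (-4, 4) and 0 elsewhere
X = torch.arange(-20, 20, 1).view(-1, 1).type(torch.FloatTensor)
Y = torch.zeros(X.shape[0])
Y[(X[:, 0] > -4) & (X[:, 0] < 4)] = 1.0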
Define the Neural Network with Same Weights Initialization, define the Criterion Function and Optimizer, and Train the Model
Create the Cross-Entropy loss function:
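A hand-written binary cross-entropy, as the lab's custom criterion presumably looks; nn.BCELoss() would be the built-in equivalent.

# Cross-Entropy loss for binary classification, written out by hand
def criterion_cross(outputs, labels):
    out = -1 * torch.mean(labels * torch.log(outputs) + (1 - labels) * torch.log(1 - outputs))
    return out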
Define the Neural Network
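Create the network; the sizes (one input, two hidden units, one output) are assumptions based on the two-dimensional activation plot in the training function.

# Create the model: 1 input, 2 hidden units, 1 output
model = Net(1, 2, 1)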
This is the PyTorch default initialization:
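One way to inspect the randomly initialized parameters:

# Print the parameters produced by the default initialization
print(model.state_dict())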
Same Weights Initialization with all ones for weights and zeros for the bias.
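A sketch of the same-weights cell: overwrite the defaults so both hidden units start out identical.

# Overwrite the default initialization so that every weight is one and
# every bias is zero; both hidden units now start out identical
model.state_dict()['linear1.weight'][0] = 1.0
model.state_dict()['linear1.weight'][1] = 1.0
model.state_dict()['linear1.bias'][0] = 0.0
model.state_dict()['linear1.bias'][1] = 0.0
model.state_dict()['linear2.weight'][0] = 1.0
model.state_dict()['linear2.bias'][0] = 0.0
print(model.state_dict())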
Define the optimizer, and train the model:
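A sketch of the optimizer and training cell; the learning rate of 0.1 is an assumption.

# Create the optimizer and train the model (learning rate assumed)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
cost_cross = train(Y, X, model, optimizer, criterion_cross, epochs=1000)

# Plot the cost per epoch
plt.plot(cost_cross)
plt.xlabel('epoch')
plt.ylabel('cost')
plt.show()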
By examining the parameters, we see that although they have changed during training, the weights and biases of the two hidden units are still identical:
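# The parameters have changed, but the two rows of linear1.weight (and
# the two entries of linear1.bias) are still equal to each other
print(model.state_dict())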
Define the Neural Network with Default Weights Initialization, define the Criterion Function and Optimizer, and Train the Model
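A sketch of the default-initialization run: re-create the model so PyTorch's random initialization breaks the symmetry between the hidden units, then train exactly as before.

# Re-create the model; the default random initialization gives each
# hidden unit different starting weights
model = Net(1, 2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
cost_cross = train(Y, X, model, optimizer, criterion_cross, epochs=1000)

plt.plot(cost_cross)
plt.xlabel('epoch')
plt.ylabel('cost')
plt.show()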
Repeat the previous steps, this time using the MSE cost or total loss:
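One possible solution sketch, using a hand-written MSE so the shapes match the training loop (nn.MSELoss() would also work, up to a broadcasting warning):

# Repeat the steps with the MSE cost
def criterion_mse(yhat, y):
    return torch.mean((yhat - y) ** 2)

model = Net(1, 2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
cost_mse = train(Y, X, model, optimizer, criterion_mse, epochs=1000)

plt.plot(cost_mse)
plt.xlabel('epoch')
plt.ylabel('cost')
plt.show()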