Activation Functions
author: Juma Shafara
date: "2024-08-12"
title: Activation Functions
keywords: [Activation Functions, Sigmoid, Tanh, ReLU]
description: How to apply different activation functions in a neural network.

Objective
- How to apply different activation functions in a neural network.
Table of Contents
In this lab, you will cover activation functions by using PyTorch.
- Logistic Function
- Tanh
- ReLU
Estimated Time Needed: 15 min
We'll need the following libraries
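A minimal sketch of the imports this lab assumes (PyTorch for the tensors and modules, matplotlib for the plots):

```python
import torch
import torch.nn as nn
import matplotlib.pyplot as plt
```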
Logistic Function
Create a tensor ranging from -10 to 10:
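For example (the name `z` is illustrative):

```python
# A column tensor of values from -10 to 10 in steps of 0.1
z = torch.arange(-10, 10, 0.1).view(-1, 1)
```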
When you use sequential, you can create a sigmoid object:
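A possible snippet:

```python
# Create a sigmoid object from torch.nn
sig = nn.Sigmoid()
```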
Apply the element-wise function Sigmoid with the object:
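For instance, using the `z` tensor from above:

```python
# Apply the sigmoid element-wise to the tensor z
yhat = sig(z)
```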
Plot the results:
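A basic matplotlib plot, for example:

```python
# Plot the input values against the sigmoid output
plt.plot(z.numpy(), yhat.numpy())
plt.xlabel('z')
plt.ylabel('yhat')
plt.show()
```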
For custom modules, call the sigmoid from torch (`torch.nn.functional` in older versions), which applies the sigmoid element-wise, and plot the results:
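A sketch using the functional form:

```python
# torch.sigmoid applies the sigmoid element-wise as a plain function
yhat = torch.sigmoid(z)
plt.plot(z.numpy(), yhat.numpy())
plt.show()
```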
Tanh
When you use sequential, you can create a tanh object:
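For example:

```python
# Create a tanh object from torch.nn
tanh = nn.Tanh()
```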
Call the object and plot it:
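For instance:

```python
# Apply tanh element-wise and plot the result
yhat = tanh(z)
plt.plot(z.numpy(), yhat.numpy())
plt.xlabel('z')
plt.ylabel('yhat')
plt.show()
```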
For custom modules, call tanh from torch (`torch.nn.functional` in older versions), which applies tanh element-wise, and plot the results:
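A sketch using the functional form:

```python
# torch.tanh applies tanh element-wise as a plain function
yhat = torch.tanh(z)
plt.plot(z.numpy(), yhat.numpy())
plt.show()
```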
ReLU
When you use sequential, you can create a ReLU object:
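A sketch that creates the object and, for completeness, applies and plots it:

```python
# Create a ReLU object, apply it element-wise, and plot the result
relu = nn.ReLU()
yhat = relu(z)
plt.plot(z.numpy(), yhat.numpy())
plt.xlabel('z')
plt.ylabel('yhat')
plt.show()
```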
For custom modules, call relu from `torch.nn.functional`, which applies ReLU element-wise, and plot the results:
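For example:

```python
import torch.nn.functional as F

# F.relu (equivalently torch.relu) applies ReLU element-wise as a plain function
yhat = F.relu(z)
plt.plot(z.numpy(), yhat.numpy())
plt.show()
```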
Compare the activation functions with a tensor in the range (-1, 1):
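One way to do this (variable names are illustrative):

```python
# Evaluate all three activations on the same range and overlay the curves
x = torch.arange(-1, 1, 0.1).view(-1, 1)
plt.plot(x.numpy(), torch.sigmoid(x).numpy(), label='sigmoid')
plt.plot(x.numpy(), torch.tanh(x).numpy(), label='tanh')
plt.plot(x.numpy(), torch.relu(x).numpy(), label='relu')
plt.legend()
plt.show()
```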