
Implementing a Neural Network from Scratch

The goal of this exercise is to learn, step by step, the different elements of a simple two-layer network by implementing it from scratch. This exercise was completed as a lab assignment in Florida Poly's Machine Learning course in Spring 2020.

Steps:

  • Set up the network, i.e., define the number of layers and the number of units per layer
  • Initialize the parameters (the W's and b's)
  • Use tanh as the activation function of the hidden layer and sigmoid for the output layer
  • Compute the cross-entropy loss
  • Implement forward and backward propagation (a code sketch of these steps follows the list)
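The steps above map onto a handful of small NumPy functions. The sketch below is a minimal, illustrative version assuming a binary-classification setup with input size `n_x`, hidden size `n_h`, and output size `n_y`; the function and variable names (`init_parameters`, `forward`, `cross_entropy_loss`, `backward`, `W1`, `b1`, ...) are chosen for this sketch and may differ from those used in the actual notebook.

```python
import numpy as np

def init_parameters(n_x, n_h, n_y, seed=2):
    # Small random weights and zero biases; shapes follow the layer sizes
    rng = np.random.default_rng(seed)
    W1 = rng.standard_normal((n_h, n_x)) * 0.01
    b1 = np.zeros((n_h, 1))
    W2 = rng.standard_normal((n_y, n_h)) * 0.01
    b2 = np.zeros((n_y, 1))
    return {"W1": W1, "b1": b1, "W2": W2, "b2": b2}

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, params):
    # Hidden layer uses tanh, output layer uses sigmoid
    Z1 = params["W1"] @ X + params["b1"]
    A1 = np.tanh(Z1)
    Z2 = params["W2"] @ A1 + params["b2"]
    A2 = sigmoid(Z2)
    return A2, {"A1": A1, "A2": A2}

def cross_entropy_loss(A2, Y):
    # Binary cross-entropy averaged over the m examples (columns of Y)
    m = Y.shape[1]
    return -np.sum(Y * np.log(A2) + (1 - Y) * np.log(1 - A2)) / m

def backward(X, Y, params, cache):
    m = X.shape[1]
    A1, A2 = cache["A1"], cache["A2"]
    dZ2 = A2 - Y                                  # sigmoid + cross-entropy gradient
    dW2 = dZ2 @ A1.T / m
    db2 = np.sum(dZ2, axis=1, keepdims=True) / m
    dZ1 = params["W2"].T @ dZ2 * (1 - A1 ** 2)    # tanh'(Z1) = 1 - A1^2
    dW1 = dZ1 @ X.T / m
    db1 = np.sum(dZ1, axis=1, keepdims=True) / m
    return {"dW1": dW1, "db1": db1, "dW2": dW2, "db2": db2}
```

A training loop would repeatedly call `forward`, `cross_entropy_loss`, and `backward`, then apply a gradient-descent update to each parameter (e.g. `W1 -= learning_rate * dW1`).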

References

The original Jupyter notebook was authored by Andrew Ng and modified by Dr. Luis Jaimes at Florida Poly for use as a lab assignment.