Loss functions in Deep Learning

While there are a ton of Deep Learning concepts scattered all around the internet, I thought why not have just one place where one can find all the fundamental concepts needed to set up their own Deep Neural Network (DNN) architecture? This series can be viewed as a reference guide that you can come back to whenever you need to brush up on the basics.

In this first part, I will discuss one of the most essential elements of deep learning - the loss function! I call it the "Oxygen of Deep Learning" because, without a loss function, a neural network cannot be trained (so it would just be dead). A loss function, also called an objective function or a cost function, shows us "how bad" our neural network's predictions are; in other words, it quantifies our unhappiness with the scores (another word for predictions) across the training data. So the lower the loss, the better our model is.

An abstract formulation (for an image classification task) is as follows - Given an image $