𝐅𝐨𝐫𝐰𝐚𝐫𝐝 𝐕𝐬 𝐁𝐚𝐜𝐤𝐰𝐚𝐫𝐝 𝐏𝐫𝐨𝐩𝐚𝐠𝐚𝐭𝐢𝐨𝐧
Date: 2022-04-03
Written by Ajai Chemmanam

𝐅𝐨𝐫𝐰𝐚𝐫𝐝 𝐏𝐫𝐨𝐩𝐚𝐠𝐚𝐭𝐢𝐨𝐧

  • The process of passing an input from the Input Layer through the network to the Output Layer to compute a prediction is called Forward Propagation.

  • It computes the output vector from the input vector, and the loss is then calculated by comparing that output with the target. The weights of the hidden layers are not adjusted during this pass, as sketched below.
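
A minimal sketch of a forward pass through a network with one hidden layer. The sigmoid activations, squared-error loss, layer sizes, and input values are illustrative assumptions, not choices taken from the article; the point is only that the input flows forward to an output and a loss, while every weight stays fixed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes: 3 inputs, 4 hidden units, 1 output.
W1 = rng.normal(scale=0.5, size=(4, 3))  # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(1, 4))  # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """Carry the input vector forward through each layer; no weights change here."""
    h = sigmoid(W1 @ x + b1)      # hidden-layer activations
    y_hat = sigmoid(W2 @ h + b2)  # output vector (the prediction)
    return h, y_hat

x = np.array([0.5, -1.2, 3.0])  # example input vector
y = np.array([1.0])             # example target
h, y_hat = forward(x)
loss = 0.5 * np.sum((y_hat - y) ** 2)  # loss computed from the output
```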

𝐁𝐚𝐜𝐤𝐰𝐚𝐫𝐝 𝐏𝐫𝐨𝐩𝐚𝐠𝐚𝐭𝐢𝐨𝐧

  • The process of going backwards, i.e., propagating the error from the Output Layer back to the Input Layer, is called Backward Propagation.

  • This is the method used to adjust the weights of the hidden layers: the gradient of the loss is computed for every weight, and each weight is updated in the direction that moves the network toward minimum loss, as sketched below.
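
Continuing the forward-pass sketch above, here is a hand-written backward pass for the same two-layer network. The gradient formulas follow from the chain rule under the sigmoid and squared-error assumptions made earlier, and the learning rate is an illustrative value, not one from the article:

```python
def backward(x, y, h, y_hat):
    """Propagate the loss gradient from the output layer back toward the input."""
    # Output layer: d(loss)/d(z2) for a squared-error loss and sigmoid output.
    delta2 = (y_hat - y) * y_hat * (1.0 - y_hat)
    grad_W2 = np.outer(delta2, h)
    grad_b2 = delta2
    # Hidden layer: push the error back through W2, then through the sigmoid.
    delta1 = (W2.T @ delta2) * h * (1.0 - h)
    grad_W1 = np.outer(delta1, x)
    grad_b1 = delta1
    return grad_W1, grad_b1, grad_W2, grad_b2

lr = 0.1  # illustrative learning rate
grad_W1, grad_b1, grad_W2, grad_b2 = backward(x, y, h, y_hat)

# Gradient-descent step: this is where the hidden-layer weights actually change.
W1 -= lr * grad_W1
b1 -= lr * grad_b1
W2 -= lr * grad_W2
b2 -= lr * grad_b2
```

In practice, frameworks such as PyTorch and TensorFlow derive these gradients automatically, but the hand-written version makes explicit what "going backwards" actually computes.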