Gradient Descent Algorithm


Machine Learning: Hypothesis Function, Cost Function, and Gradient Descent Algorithm (Short Notes)

  These are short scrap notes from the Machine Learning course taught by Andrew Ng, Associate Professor at Stanford University, on Coursera.

Hypothesis: hθ(x) = θ₀ + θ₁x
Parameters: θ₀, θ₁
Goal: choose θ₀ and θ₁ to minimise the cost function J(θ₀, θ₁)

Gradient descent: an algorithm which minimises the cost function J(θ₀, θ₁).
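To make the update rule concrete, here is a minimal sketch of batch gradient descent for this one-variable hypothesis in Python. It assumes the usual squared-error cost from the course, J(θ₀, θ₁) = (1/2m) Σ (hθ(x) − y)² over the m training examples; the toy data, learning rate, and iteration count below are illustrative choices, not values from these notes.

```python
import numpy as np

def gradient_descent(x, y, alpha=0.01, iterations=1000):
    """Batch gradient descent for h_theta(x) = theta0 + theta1 * x.

    alpha (learning rate) and iterations are illustrative defaults,
    not values taken from the notes.
    """
    m = len(y)                       # number of training examples
    theta0, theta1 = 0.0, 0.0        # start both parameters at zero
    for _ in range(iterations):
        h = theta0 + theta1 * x      # hypothesis evaluated on all examples
        # Partial derivatives of J(theta0, theta1) = (1/2m) * sum((h - y)^2)
        grad0 = (1.0 / m) * np.sum(h - y)
        grad1 = (1.0 / m) * np.sum((h - y) * x)
        # Simultaneous update of both parameters
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Toy data roughly following y = 2x + 1 (made up for illustration)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.1, 5.0, 6.9, 9.2])
print(gradient_descent(x, y))
```

Note that both partial derivatives are computed from the same pass over the data before either parameter is changed, so the update is simultaneous, which is the behaviour the course emphasises.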