Gradient Descent and Cost Function

How do Gradient Descent and the Cost Function work in ML?


In ML, our goal is to find the weights that minimize the cost function.



In the image above you can see the global minimum, but how do we find it? We take small steps towards it, and the size of each step is controlled by the learning rate: at every point you compute the slope of the tangent and use it to move to the next point, and as you approach the minimum the steps become smaller and smaller. Now the question is: how do we know what learning rate we need? It is basically a trial-and-error method, so you try different learning rates and pick one that converges to the global minimum.

If your learning rate is too big, each step may overshoot and miss the global minimum entirely.
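To see why, here is a small illustrative sketch (not from the original post): minimizing f(x) = x² with two different learning rates. The function, starting point, and rates are all made up for demonstration.

```python
def gradient_descent_1d(lr, steps=20, x=5.0):
    """Take `steps` gradient steps on f(x) = x**2 starting from x."""
    for _ in range(steps):
        x = x - lr * 2 * x  # the gradient of x**2 is 2x
    return x

small = gradient_descent_1d(lr=0.1)   # converges toward the minimum at x = 0
large = gradient_descent_1d(lr=1.1)   # overshoots: |x| grows with every step

print(small, large)
```

With the small learning rate, x shrinks toward 0; with the large one, each step jumps past the minimum to a point farther away than where it started, so the iterates diverge.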

Our linear regression line equation is:

    y = mx + b

where m is the slope of the line and b is the y-intercept. Gradient descent adjusts m and b step by step to fit the line to the data.

  
The step size at each iteration comes from the partial derivatives of the cost function. Here the cost function is the mean squared error (MSE):

    MSE = (1/n) * Σ (y_i − (m*x_i + b))²

Its gradients with respect to m and b are:

    ∂MSE/∂m = −(2/n) * Σ x_i * (y_i − (m*x_i + b))
    ∂MSE/∂b = −(2/n) * Σ (y_i − (m*x_i + b))

and each step updates the parameters as:

    m = m − learning_rate * ∂MSE/∂m
    b = b − learning_rate * ∂MSE/∂b
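As a quick sanity check of these formulas, here is one gradient step worked out in code on a tiny made-up dataset (the data and learning rate are illustrative, not from the post):

```python
import numpy as np

# Tiny illustrative dataset: true relationship is y = 2x
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])

m, b = 0.0, 0.0       # start from zero
learning_rate = 0.1
n = len(x)

y_pred = m * x + b
dm = -(2 / n) * np.sum(x * (y - y_pred))   # ∂MSE/∂m
db = -(2 / n) * np.sum(y - y_pred)         # ∂MSE/∂b

m = m - learning_rate * dm
b = b - learning_rate * db
print(m, b)
```

Starting from m = b = 0, the residuals equal y itself, so dm = −(2/3)·28 ≈ −18.67 and db = −(2/3)·12 = −8, and one step moves m to about 1.87 and b to 0.8 — both parameters move toward the true line.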




Gradient Descent and Cost Function in Python:

First, we import the library.


Now we define the gradient_descent function.


Now we define X and Y


Let's call the function and see the output.
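The steps above can be sketched as one self-contained script. The dataset (a line y = 2x + 3 plus nothing else), the learning rate of 0.08, and the 100 iterations are illustrative assumptions, not the post's exact values:

```python
import numpy as np

def gradient_descent(x, y, learning_rate=0.08, iterations=100):
    """Fit y = m*x + b by gradient descent on the MSE cost."""
    m, b = 0.0, 0.0
    n = len(x)
    for i in range(iterations):
        y_pred = m * x + b
        cost = np.mean((y - y_pred) ** 2)            # MSE cost function
        dm = -(2 / n) * np.sum(x * (y - y_pred))     # ∂MSE/∂m
        db = -(2 / n) * np.sum(y - y_pred)           # ∂MSE/∂b
        m -= learning_rate * dm                      # step against the gradient
        b -= learning_rate * db
        print(f"iteration {i}: m = {m:.4f}, b = {b:.4f}, cost = {cost:.4f}")
    return m, b

x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([5, 7, 9, 11, 13], dtype=float)   # true line: y = 2x + 3

m, b = gradient_descent(x, y)
```

After 100 iterations, m and b land close to the true values 2 and 3, and the printed cost shrinks toward zero.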



OUTPUT : 




You can see that by iteration 99 the values of m and b and the value of the cost function are stable and no longer vary much, so gradient descent has found the global minimum.

If you want to visualise this output, it looks like this:
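One way to produce such a plot is to record the cost at every iteration and chart it. This sketch assumes matplotlib is installed (the plotting part is optional and skipped if it is not), and reuses the same illustrative dataset as above:

```python
import numpy as np

# Same illustrative dataset: true line y = 2x + 3
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([5, 7, 9, 11, 13], dtype=float)

m, b = 0.0, 0.0
learning_rate = 0.08
costs = []                                   # cost history, one value per iteration
for _ in range(100):
    y_pred = m * x + b
    costs.append(np.mean((y - y_pred) ** 2))
    dm = -(2 / len(x)) * np.sum(x * (y - y_pred))
    db = -(2 / len(x)) * np.sum(y - y_pred)
    m -= learning_rate * dm
    b -= learning_rate * db

try:
    import matplotlib.pyplot as plt         # plotting is optional
    plt.plot(range(len(costs)), costs)
    plt.xlabel("iteration")
    plt.ylabel("MSE cost")
    plt.title("Cost falling as gradient descent converges")
    plt.savefig("cost_curve.png")
except ImportError:
    pass
```

The curve drops steeply in the first few iterations and then flattens out near zero, which is the visual signature of convergence described above.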





 
