Machine Learning: Multivariate Linear Regression

More notes..

m = number of training examples

n = number of features

x^{(i)} = input (features) of the i^{th} training example

x_j^{(i)} = value of feature j in the i^{th} training example

  • Hypothesis function:

h_\theta(x) = \theta_0 x_0 + \theta_1 x_1 + \dots + \theta_n x_n = \theta^T x, \quad \text{with } x_0 = 1

  • Gradient descent:

repeat until convergence \{
\theta_j:=\theta_j-\alpha\frac{1}{m}\sum\limits_{i=1}^m (h_\theta(x^{(i)}) - y^{(i)})x_j^{(i)}
\}

(simultaneously update \theta_j for j=0,1,…,n)
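The update rule above can be sketched in NumPy as follows. The function name and the toy data are my own illustration, not from the notes; the vectorized line `X.T @ (predictions - y) / m` computes the sum over all m training examples for every \theta_j at once, which also gives the required simultaneous update.

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, iterations=1000):
    """Batch gradient descent for multivariate linear regression.

    X: (m, n+1) design matrix whose first column is all ones (x_0 = 1).
    y: (m,) vector of targets.
    Returns the learned parameter vector theta of shape (n+1,).
    """
    m = len(y)
    theta = np.zeros(X.shape[1])
    for _ in range(iterations):
        # h_theta(x^(i)) = theta^T x^(i), for every training example at once
        predictions = X @ theta
        # Simultaneous update: theta_j := theta_j - alpha * (1/m) * sum(...)
        gradient = X.T @ (predictions - y) / m
        theta = theta - alpha * gradient
    return theta

# Hypothetical toy data generated from y = 1 + 2*x1 + 3*x2
X = np.array([[1.0, 0, 0], [1, 1, 0], [1, 0, 1], [1, 1, 1], [1, 2, 1]])
y = np.array([1.0, 3, 4, 6, 8])
theta = gradient_descent(X, y, alpha=0.3, iterations=2000)
print(theta)  # approaches [1, 2, 3]
```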

  • Feature Scaling: divide each feature by its range so all features end up on a similar scale, which speeds up gradient descent
  • Mean Normalization: x_j := (x_j - \mu_j)/s_j, where \mu_j is the mean of feature j and s_j is its range (or standard deviation)
  • Normal Equation: \theta = (X^TX)^{-1}X^Ty, the closed-form solution


  • Gradient Descent vs Normal Equation: gradient descent needs a learning rate \alpha and many iterations, but scales well when n is large; the normal equation needs no \alpha and no iterations, but computing (X^TX)^{-1} costs roughly O(n^3), so it becomes slow for very many features
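The trade-off can be seen by fitting the same hypothetical data both ways (data and iteration counts are my own example): both methods converge to the same \theta, but one iterates while the other solves a single matrix equation.

```python
import numpy as np

# Hypothetical toy data generated from y = 1 + 2*x1 + 3*x2
X = np.array([[1.0, 0, 0], [1, 1, 0], [1, 0, 1], [1, 1, 1], [1, 2, 1]])
y = np.array([1.0, 3, 4, 6, 8])
m = len(y)

# Gradient descent: needs a learning rate alpha and many iterations
theta_gd = np.zeros(3)
for _ in range(3000):
    theta_gd -= 0.3 / m * (X.T @ (X @ theta_gd - y))

# Normal equation: one matrix computation, no alpha, no iterations,
# but inverting X^T X costs roughly O(n^3) in the number of features
theta_ne = np.linalg.pinv(X.T @ X) @ X.T @ y

print(theta_gd, theta_ne)  # both approach [1, 2, 3]
```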
