
Linear Regression (Simple)

Hello everyone, you have reached a good place to understand the machine learning algorithm called linear regression. We will go through this topic step by step, starting from the basics, i.e. simple linear regression, where we have only one input variable.

Let's understand with an example: suppose we run a business and have to decide how much to invest to earn a certain amount. Or: what is the value of a 3BHK flat, or what is the value of a 5-year-old car?

So, linear regression is generally used to predict continuous data.

It is a simple and commonly used ML algorithm, based on the linear function we learned in 10th or 11th class. So let's start with the equation.

y = m*x + c (the game is all about this function)

y is called the dependent variable or output, i.e. the predicted value.

x is called the input or independent variable, i.e. the value of y depends on the value of x.

m is called the slope, i.e. the change in y divided by the change in x.

c is called the intercept, the point where the line crosses the y-axis (the value of y when x = 0).
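For instance, a quick check of the equation with made-up values (the m and c here are only illustrative, not fitted from any data):

```python
# Evaluate y = m*x + c for an example slope and intercept
m, c = 2, 1   # illustrative values only
x = 3
y = m * x + c
print(y)  # 2*3 + 1 = 7
```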

Here I am taking x = [2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13]

and the corresponding y = [5, 6.5, 8, 9.5, 11, 12.5, 14, 15.5, 17, 18.5, 20, 21.5]

Our job is to find the values of m and c in the equation, so that we can easily compute y for any given value of x.

Right, let's get to the trick to find m and c.

We have two methods:

The first is OLS (ordinary least squares) and the second is gradient descent. Let's understand the first one, the OLS technique.

Say we have our own business and want to predict the earnings for a particular investment.

So we have the investment in the x variable and the earnings in the y variable.








Let's walk through the calculation step by step.

Step 1: calculate the mean of x and y.

meanx = sum(x) / len(x)

meany = sum(y) / len(y)

Step 2: calculate the deviation of each x and y value from its mean (this is often loosely called the variance, but strictly it is the deviation).

dev_x = x - meanx

dev_y = y - meany

Step 3: multiply the deviations of x and y pairwise and sum them.

sum(dev_x * dev_y)

Step 4: square the deviations of x and sum them.

sum(dev_x ** 2)

m = step3 / step4

c = meany - (m * meanx)

Python implementation:
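The OLS steps above can be sketched in plain Python, using the x and y lists from earlier (the variable names here are my own):

```python
# Simple linear regression via ordinary least squares (OLS)
x = [2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13]
y = [5, 6.5, 8, 9.5, 11, 12.5, 14, 15.5, 17, 18.5, 20, 21.5]

# Step 1: means of x and y
mean_x = sum(x) / len(x)
mean_y = sum(y) / len(y)

# Steps 2-4: deviations from the mean, the summed cross-product,
# and the summed squared deviations of x
num = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
den = sum((xi - mean_x) ** 2 for xi in x)

m = num / den            # slope
c = mean_y - m * mean_x  # intercept

print(m, c)  # 1.5 2.0 for this data

def predict(x_new):
    return m * x_new + c

print(predict(14))  # 23.0
```

With m and c in hand, predicting y for a new x is just one multiply and one add.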











We have completed this topic; you can now refer to my YouTube channel to watch the video.

Please refer to GitHub for the Python code using the sklearn library.
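For reference, a minimal sklearn version might look like this (a sketch of the same fit, not the exact code from the GitHub repo):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Same data as above; sklearn expects a 2-D feature array
x = np.array([2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13]).reshape(-1, 1)
y = np.array([5, 6.5, 8, 9.5, 11, 12.5, 14, 15.5, 17, 18.5, 20, 21.5])

model = LinearRegression().fit(x, y)
print(model.coef_[0], model.intercept_)  # slope m and intercept c
print(model.predict([[14]]))             # prediction for a new x
```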


Multiple Linear Regression:

In the last blog we discussed simple linear regression, where we have only one input variable, x; that is why it is called simple linear regression. In this blog we will learn about multiple linear regression.

As the name suggests, multiple means we will have more than one input variable, i.e. more than one x value.

Suppose I want to predict a home's price from its size, number of bedrooms, number of bathrooms, and age. In this scenario we have 4 x values, and we implement regression with these 4 variables.

The formula for multiple linear regression is:

y = m1*x1 + m2*x2 + m3*x3 + ... + mn*xn + c

So basically we use multiple inputs to predict the outcome.

Just as before, we can use the OLS method to fit the model for multiple linear regression.
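A small sketch of fitting such a model with NumPy's least-squares solver (the house data below is made up purely for illustration):

```python
import numpy as np

# Hypothetical house data: size (sqft), bedrooms, bathrooms, age (years)
X = np.array([
    [1500, 3, 2, 10],
    [2000, 4, 3,  5],
    [1200, 2, 1, 20],
    [1800, 3, 3,  8],
    [2500, 4, 3,  2],
    [1000, 2, 1, 30],
], dtype=float)
y = np.array([335000, 450000, 245000, 397000, 528000, 205000], dtype=float)

# Append a column of ones so the intercept c is fitted along with m1..m4
X1 = np.column_stack([X, np.ones(len(X))])

# Solve y = m1*x1 + m2*x2 + m3*x3 + m4*x4 + c in the least-squares sense
coeffs, *_ = np.linalg.lstsq(X1, y, rcond=None)
m, c = coeffs[:-1], coeffs[-1]

# Predict the price of a new house: 1600 sqft, 3 bed, 2 bath, 12 years old
pred = m @ np.array([1600, 3, 2, 12]) + c
print(pred)
```

The extra column of ones is the standard trick that lets one least-squares solve return the intercept together with the slopes.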




