Sunday, January 18, 2026

Linear Regression Models

Fasten your seatbelts, you are in for a ride! I will take you through Artificial Intelligence step by step, so you can easily grasp the basic concepts. We will start with Linear Regression Models, a statistical technique defined as a “type of supervised machine-learning algorithm that learns from the labelled datasets and maps the data points with most optimized linear functions which can be used for prediction on new datasets. It assumes that there is a linear relationship between the input and output, meaning the output changes at a constant rate as the input changes. This relationship is represented by a straight line”.

It is used to make predictions where an independent variable and a dependent variable are related to each other in a linear way. A simple regression model is represented by the following formula, where Y is the dependent variable and X is the independent variable. In this formula, b represents the slope of the line and a represents the intercept (the value of Y where the line crosses the vertical axis, i.e., the predicted output when X is zero).

Y = a + bX

It will be clearer to explain this through an example. Let's assume that there is a linear relationship between advertising spend and sales, both measured in thousands of dollars.

a) Intercept (a = 3):
If advertising spend is zero, expected sales are $3,000.

b) Slope (b = 2):
For every additional $1,000 spent on advertising, sales increase by $2,000.

If the company spends $6,000 on advertising:

Y = 3 + 2 (6) = 15

Predicted Sales = $15,000
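The arithmetic above can be sketched as a tiny Python function. The function name and defaults are illustrative, with the intercept and slope taken from the example (values in thousands of dollars):

```python
# Simple linear regression prediction: Y = a + b*X
# (a = intercept, b = slope, all values in $1,000s, as in the example above)

def predict_sales(ad_spend, intercept=3, slope=2):
    """Predicted sales (in $1,000s) for a given advertising spend (in $1,000s)."""
    return intercept + slope * ad_spend

print(predict_sales(6))  # 15, i.e. predicted sales of $15,000
```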

Note that this simple regression model assumes a linear relationship between the dependent and independent variables; once the slope and intercept are estimated, we can predict how much sales will increase for every USD spent on advertising. The formula provides an approximation, meaning that once executed, the real impact of advertising spend on sales will diverge above or below the prediction. However, the trend is clear, as the following graphic shows: even where actual data points diverge from the line, the linear relationship between the variables holds.
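How are the slope and intercept estimated in the first place? The standard approach is ordinary least squares: choose the line that minimizes the squared vertical distances to the data points. Here is a minimal sketch, using invented noisy data that scatters around the true line Y = 3 + 2X from the example:

```python
# Ordinary least-squares fit of y = a + b*x to noisy data.
# The data points are invented for illustration (spend and sales in $1,000s).

def fit_line(x, y):
    """Return (intercept a, slope b) minimizing the sum of squared errors."""
    n = len(x)
    x_mean = sum(x) / n
    y_mean = sum(y) / n
    # Slope: covariance of x and y divided by variance of x.
    b = (sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y))
         / sum((xi - x_mean) ** 2 for xi in x))
    # Intercept: the line must pass through the point of means.
    a = y_mean - b * x_mean
    return a, b

x = [1, 2, 3, 4, 5]                 # advertising spend
y = [5.1, 6.9, 9.2, 10.8, 13.0]     # actual sales: roughly 3 + 2x, with noise
a, b = fit_line(x, y)
print(round(a, 2), round(b, 2))     # close to the true intercept 3 and slope 2
```

Even though no data point sits exactly on the line, the fit recovers values near the underlying intercept and slope, which is exactly the behavior the graphic depicts.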

[Graphic: actual data points scattered above and below the fitted regression line of sales versus advertising spend]

The model can also include several input variables, leading to multiple linear regression, which involves one dependent variable and multiple independent variables. We will not cover it in depth here, but consider the possibility that multiple inputs affect the output as well.
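To give a flavor of what that looks like, here is a hedged sketch using NumPy's least-squares solver. The data and the second variable (number of sales reps) are invented for illustration; the point is simply that the same fitting idea extends to several inputs:

```python
# Multiple linear regression: y = a + b1*x1 + b2*x2, solved by least squares.
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # advertising spend ($1,000s)
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])  # sales reps (invented second input)
y = 3 + 2 * x1 + 1.5 * x2                 # exact plane, no noise, for clarity

# Design matrix: a column of ones for the intercept, then one column per input.
X = np.column_stack([np.ones_like(x1), x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # recovers the intercept 3 and the two slopes 2 and 1.5
```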

Linear regression has its roots in the least-squares work of Legendre and Gauss in the early 1800s, and the term "regression" itself comes from Francis Galton's studies in the 1880s; since then, greater computational power has made fitting multiple linear regressions simple and fast. These models constitute a backbone of machine learning and are widely used in forecasting. Consider the stock market: using multiple linear regression, several variables of historical data can be combined to predict the future price of a stock. Are you ready to let linear regression models predict the future? The road to artificial intelligence has just begun!

 

Linear Regression in Machine learning - GeeksforGeeks

