
*Basic Intuition*

Knowing how the things in our daily lives are related to each other, and how a particular situation is affected by its surroundings, can be of great importance. This information supports better decision-making and more efficient problem-solving. To describe such a functional relationship, we consider two types of variables: predictor variables (independent or explanatory variables) and response variables (dependent or explained variables). The *dependent variable* is the main factor you’re trying to understand or predict, while an *independent variable* is a factor that you hypothesize has an impact on your dependent variable.

Thus, regression analysis is a technique that estimates the relationship between a dependent variable and one or more explanatory variables. With regression analysis, you can model the relationship between the chosen variables and predict values based on that model.

Regression analysis is primarily used for two conceptually distinct purposes.

- It is widely used for *prediction* and *forecasting*, where its use has substantial overlap with the field of machine learning.
- It can be used to infer causal relationships between the independent and dependent variables. This is especially important when researchers hope to estimate causal relationships from observational data.

A simple classification of regression analysis:

The most common form of regression analysis is *linear regression*, in which one finds the line (or a more complex linear combination) that most closely fits the data according to a specific mathematical criterion. Mathematically, the line representing a simple linear regression is expressed through a basic equation:

**Y = a + bX + e**

where ‘a’ is the intercept (the value of the dependent variable Y when the independent variable X is zero), ‘b’ is the slope of the line (the change in Y divided by the change in X), and ‘e’ is the error term.
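As a quick sketch, the equation above can be fitted with NumPy’s `polyfit`; the hours-vs-score data here is hypothetical, purely for illustration:

```python
import numpy as np

# Hypothetical data: hours studied (X) vs. exam score (Y)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([52.0, 55.0, 61.0, 64.0, 68.0])

# np.polyfit with degree 1 returns [slope b, intercept a]
b, a = np.polyfit(X, Y, 1)

# Fitted values and residuals (the error term e)
Y_hat = a + b * X
residuals = Y - Y_hat

print(f"intercept a = {a:.2f}, slope b = {b:.2f}")
# → intercept a = 47.70, slope b = 4.10
```

With an intercept in the model, the residuals of a least-squares fit always sum to zero, which is a handy sanity check.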

Multiple regression is a type of analysis that uses more than one predictor variable to predict the dependent variable. In multiple regression, the model simultaneously fits the data using all the predictor variables. This allows the model to account for the interdependencies among the predictor variables. Mathematically, it can be shown as:

**Y = a + b1X1 + b2X2 + . . . + bnXn + e**
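A minimal sketch of this simultaneous fit, using NumPy’s least-squares solver on hypothetical, noise-free data (generated from known coefficients so the fit recovers them exactly):

```python
import numpy as np

# Hypothetical, noise-free data generated from Y = 10 + 2*X1 + 3*X2
X1 = np.array([50.0, 70.0, 90.0, 110.0])
X2 = np.array([10.0, 5.0, 20.0, 2.0])
Y = 10.0 + 2.0 * X1 + 3.0 * X2

# Design matrix: a column of ones (for the intercept a) plus both predictors
A = np.column_stack([np.ones_like(X1), X1, X2])

# Least squares estimates all coefficients [a, b1, b2] simultaneously,
# accounting for the interdependence between X1 and X2
coef, *_ = np.linalg.lstsq(A, Y, rcond=None)
a, b1, b2 = coef
print(f"a = {a:.1f}, b1 = {b1:.1f}, b2 = {b2:.1f}")
# → a = 10.0, b1 = 2.0, b2 = 3.0
```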

Nonlinear regression is a mathematical model that fits an equation to data when the relationship cannot be captured by a straight line. Whereas linear regression uses a straight-line equation (such as Y = c + mX), nonlinear regression describes the association using a curve, making the model nonlinear in its parameters.

A simple nonlinear regression model is expressed as follows:

**Y = f(X, b) + ϵ**

Here b is a vector of parameters and f is the known regression function.
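As an illustration, assuming SciPy is available, `scipy.optimize.curve_fit` can estimate the parameter vector b for a chosen regression function f. The exponential model and its data here are hypothetical (generated noise-free from known parameters):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical nonlinear regression function: f(X, b) = b0 * exp(b1 * X)
def f(X, b0, b1):
    return b0 * np.exp(b1 * X)

X = np.linspace(0.0, 2.0, 20)
Y = f(X, 2.0, 1.5)  # noise-free data from known parameters b = (2.0, 1.5)

# curve_fit iteratively refines the parameter estimates from a starting guess
b_hat, _ = curve_fit(f, X, Y, p0=[1.0, 1.0])
print(b_hat)  # ≈ [2.0, 1.5]
```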

Finally, the goal of these regression models is to minimize the error sum of squares, which can be achieved by Ordinary Least Squares (OLS) in linear cases and by iterative methods such as gradient descent in nonlinear scenarios.
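To make the two routes concrete, here is a small sketch (on hypothetical noise-free data) that minimizes the same error sum of squares twice: once with the closed-form least-squares solution and once with plain gradient descent. Both should arrive at the same coefficients.

```python
import numpy as np

# Toy data with a known linear relationship: Y = 3 + 2X (no noise)
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
Y = 3.0 + 2.0 * X

# Route 1: closed-form OLS via the normal-equations solver
A = np.column_stack([np.ones_like(X), X])
ols = np.linalg.lstsq(A, Y, rcond=None)[0]  # [a, b]

# Route 2: gradient descent on the same sum-of-squares objective
a, b = 0.0, 0.0
lr = 0.01
for _ in range(20000):
    err = (a + b * X) - Y
    a -= lr * 2 * err.sum()         # d(SSE)/da = 2 * sum(err)
    b -= lr * 2 * (err * X).sum()   # d(SSE)/db = 2 * sum(err * X)

print(ols, (a, b))  # both ≈ (3.0, 2.0)
```

The learning rate and iteration count here are chosen for this toy problem; in practice both need tuning, which is why closed-form OLS is preferred whenever the model is linear.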

Regression analysis comprises a diversity of methods, and the various types will be discussed in the coming sections.

Thank you. Do follow and give some feedback.

