Computing Missing Values with Linear Regression
Published: 2019-03-04


  • Missing data

    Missing data can broadly be classified into three types; a small simulation sketch follows this list:

    1. MCAR (Missing Completely At Random) means that there is nothing systematic about why some data is missing. That is, there is no relationship between the fact that data is missing and either the observed or unobserved covariates.
    2. MAR (Missing At Random) resembles MCAR in that there is still an element of randomness, but the probability that a value is missing depends only on the observed data, not on the missing value itself.
    3. MNAR (Missing Not At Random) implies that the fact that data is missing is directly correlated with the value of the missing data.
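
The following Python sketch is not from the original post; the toy columns x and y and the drop probabilities are invented purely to illustrate how the three mechanisms differ when you simulate them:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(size=n)                       # always-observed covariate
y = 2.0 * x + rng.normal(scale=0.5, size=n)  # variable that will lose values
df = pd.DataFrame({"x": x, "y": y})

# MCAR: every y value is dropped with the same probability,
# unrelated to x or to y itself.
mcar = df.copy()
mcar.loc[rng.random(n) < 0.2, "y"] = np.nan

# MAR: the chance that y is missing depends only on the observed covariate x.
mar = df.copy()
mar.loc[(rng.random(n) < 0.4) & (mar["x"] > 0), "y"] = np.nan

# MNAR: the chance that y is missing depends on the (unobserved) value of y itself.
mnar = df.copy()
mnar.loc[(rng.random(n) < 0.4) & (mnar["y"] > 1.0), "y"] = np.nan

print(mcar["y"].isna().mean(), mar["y"].isna().mean(), mnar["y"].isna().mean())
```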
  • How to deal with missing data

    1. Just delete missing entries
    2. Replacing missing values with the mean or median
    3. Linear Regression

      First, several predictors of the variable with missing values are identified using a correlation matrix. The best predictors are selected and used as independent variables in a regression equation.

      The variable with missing data is used as the dependent variable.

      Second, cases with complete data for the predictor variables are used to generate the regression equation.

      Third, the equation is then used to predict missing values for incomplete cases in an iterative process.

      The steps above describe single-variable (simple) linear regression; a minimal imputation sketch follows.
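
A minimal sketch of these three steps, assuming the data sits in a pandas DataFrame and the column with missing values is called target (the column names, and the use of scikit-learn's LinearRegression, are my own choices for illustration, not from the post):

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

def impute_with_regression(df: pd.DataFrame, target: str) -> pd.DataFrame:
    """Fill NaNs in `target` using the single best-correlated predictor."""
    df = df.copy()

    # Step 1: pick the predictor most strongly correlated with the target
    # (pandas computes each pairwise correlation on complete pairs only).
    corr = df.corr(numeric_only=True)[target].drop(target).abs()
    predictor = corr.idxmax()

    # Step 2: fit the regression on cases that are complete
    # for both the target and the chosen predictor.
    complete = df.dropna(subset=[target, predictor])
    model = LinearRegression().fit(complete[[predictor]], complete[target])

    # Step 3: predict the target for rows where it is missing
    # but the predictor is observed.
    missing = df[target].isna() & df[predictor].notna()
    df.loc[missing, target] = model.predict(df.loc[missing, [predictor]])
    return df

# For comparison, mean imputation (option 2 above) would simply be:
# df["target"] = df["target"].fillna(df["target"].mean())
```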

    4. Multiple linear regression

      Simple linear regression has significant limits:

      • It can’t easily match any data set that is non-linear
      • It can only be used to make predictions that fit within the range of the training data set
      • It can only be fit to data sets with a single dependent variable and a single independent variable

      This is where multiple regression comes in. It is specifically designed to create regressions on models with a single dependent variable and multiple independent variables.

      The equation for multiple regression takes the form:

      $y = b_1 x_1 + b_2 x_2 + \dots + b_n x_n + a$

      $b_i$ — the regression coefficients;

      $x_i$ — the independent variables, also called predictor variables;

      $y$ — the dependent variable, also called the criterion variable;

      $a$ — a constant (the intercept), the value of the dependent variable when all predictors are zero.

      How do we fit a multiple regression model?

      Just as we minimize the sum of squared errors to find $b$ in simple linear regression, we minimize the sum of squared errors to find all of the $b_i$ terms in multiple regression.

      In practice, this can be done with stochastic gradient descent.

      How do we make sure the model fits the data well?

      Use the same $r^2$ value that was used for simple linear regression.

      $r^2$, called the coefficient of determination, states the proportion of variation in the data that is explained by the model. It ranges from 0 to 1, with 0 meaning the model has no ability to predict the result and 1 meaning the model predicts the result perfectly. A small fitting sketch follows.
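
As a rough end-to-end illustration (the synthetic data and parameter values below are mine, not from the post), the sketch fits a multiple regression with scikit-learn's SGDRegressor, i.e. by stochastic gradient descent on the squared error, and then reports the coefficients $b_i$, the intercept $a$ and $r^2$:

```python
import numpy as np
from sklearn.linear_model import SGDRegressor
from sklearn.metrics import r2_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 3))                        # three independent variables x_1..x_3
y = 3 * X[:, 0] - 2 * X[:, 1] + 0.5 * X[:, 2] + 1.0  # true b = (3, -2, 0.5), a = 1
y += rng.normal(scale=0.1, size=500)                 # observation noise

# SGD is sensitive to feature scale, so standardize the predictors first.
scaler = StandardScaler().fit(X)
model = SGDRegressor(loss="squared_error", max_iter=1000, tol=1e-6)
model.fit(scaler.transform(X), y)

print("coefficients b_i:", model.coef_)   # estimated b_i (on the scaled features)
print("intercept a:", model.intercept_)
print("r^2:", r2_score(y, model.predict(scaler.transform(X))))
```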

