Imputing Missing Values with Linear Regression
Views: 352
Published: 2019-03-04

This article is about 2,672 characters long and takes roughly 8 minutes to read.

  • Missing data

    Missing data can broadly be classified into three types:

    1. MCAR (Missing Completely At Random) means that there is nothing systematic about why some data is missing; that is, there is no relationship between the fact that data is missing and either the observed or unobserved covariates.
    2. MAR (Missing At Random) resembles MCAR in that there is still an element of randomness, but the probability that a value is missing may depend on the observed data rather than on the unobserved value itself.
    3. MNAR (Missing Not At Random) implies that the fact that data is missing is directly correlated with the value of the missing data.
  • How to deal with missing data

    1. Just delete missing entries
    2. Replacing missing values with the mean or median
    3. Linear Regression

      First, several predictors of the variable with missing values are identified using a correlation matrix. The best predictors are selected and used as independent variables in a regression equation.

      The variable with missing data is used as the dependent variable.

      Second, cases with complete data for the predictor variables are used to generate the regression equation;

      Third, the equation is then used to predict missing values for incomplete cases in an iterative process.

      The above describes simple (single-variable) linear regression imputation; a minimal sketch follows below.
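
      A minimal Python sketch of this single-variable regression imputation, assuming a pandas DataFrame with a fully observed predictor column "x" and a target column "y" containing missing values (the column names and data are illustrative). Dropping incomplete rows and mean filling are included for comparison with options 1 and 2 above.

      ```python
      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LinearRegression

      # Toy data: "x" is fully observed, "y" has missing entries.
      df = pd.DataFrame({
          "x": [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],
          "y": [2.1, np.nan, 6.2, 8.1, np.nan, 12.3],
      })

      # Options 1 and 2 for comparison: drop incomplete rows, or fill with the mean.
      dropped = df.dropna(subset=["y"])
      mean_filled = df["y"].fillna(df["y"].mean())

      # Regression imputation: fit on the complete cases, then predict the missing ones.
      complete = df[df["y"].notna()]
      model = LinearRegression().fit(complete[["x"]], complete["y"])

      missing = df["y"].isna()
      df.loc[missing, "y"] = model.predict(df.loc[missing, ["x"]])
      print(df)
      ```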

    4. Multiple linear regression

      Linear regression has significant limits, such as:

      • It can’t easily match any data set that is non-linear
      • It can only be used to make predictions that fit within the range of the training data set
      • It can only be fit to data sets with a single dependent variable and a single independent variable

      This is where multiple regression comes in. It is specifically designed to create regressions on models with a single dependent variable and multiple independent variables.

      The equation for multiple regression takes the form:

      $y = b_1 x_1 + b_2 x_2 + \dots + b_n x_n + a$

      $b_i$: the coefficients;

      $x_i$: the independent variables, also called predictor variables;

      $y$: the dependent variable, also called the criterion variable;

      $a$: a constant (the intercept), the value of the dependent variable when all predictors are zero.

      How to fit a multiple regression model?

      Just as we minimize the sum of squared errors to find the coefficient $b$ in simple linear regression, we minimize the sum of squared errors to find all the $b_i$ terms in multiple regression.

      In practice, we can use stochastic gradient descent to do this, as in the sketch below.
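
      A minimal NumPy sketch of this idea, fitting $y = b_1 x_1 + \dots + b_n x_n + a$ by updating the coefficients one sample at a time with stochastic gradient descent; the synthetic data, learning rate, and epoch count are illustrative assumptions, not tuned values.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 3))                    # 200 samples, 3 predictors
      true_b, true_a = np.array([2.0, -1.0, 0.5]), 3.0
      y = X @ true_b + true_a + rng.normal(scale=0.1, size=200)

      b = np.zeros(3)   # coefficients b_i
      a = 0.0           # intercept a
      lr = 0.01         # learning rate

      for epoch in range(50):
          for i in rng.permutation(len(y)):            # one sample at a time (stochastic)
              err = (X[i] @ b + a) - y[i]              # prediction error for this sample
              b -= lr * err * X[i]                     # gradient step for b (factor of 2 absorbed into lr)
              a -= lr * err                            # gradient step for a

      print(b, a)   # should be close to true_b and true_a
      ```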

      How to make sure the model fits the data well?

      Use the same $r^2$ value that was used for linear regression.

      $r^2$, called the coefficient of determination, states the proportion of the variation in the data set that is explained by the model. It is a value ranging from 0 to 1, with 0 meaning that the model has no ability to predict the result and 1 meaning that the model predicts the result perfectly.
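
      A small sketch of computing $r^2 = 1 - SS_{res} / SS_{tot}$ directly and checking it against scikit-learn's r2_score; the numbers are made up for illustration.

      ```python
      import numpy as np
      from sklearn.metrics import r2_score

      y_true = np.array([3.0, 5.0, 7.0, 9.0])          # observed values
      y_pred = np.array([2.8, 5.1, 7.3, 8.9])          # model predictions

      ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
      ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
      r2 = 1.0 - ss_res / ss_tot

      print(r2, r2_score(y_true, y_pred))              # the two values agree
      ```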


Reposted from: http://pjge.baihongyu.com/
