
Linear Regression Algorithm

Linear regression fits a model of the form y ≈ w·x + b by minimizing the squared error between predictions and labels. Below, the same dataset is fit twice: once with scikit-learn in Python and once with Spark MLlib in Scala.

Preparing the data

Each line of the data file holds one sample: the target value, a comma, then eight space-separated standardized feature values (this appears to be the lpsa prostate dataset that ships with Spark MLlib's example data):

-0.4307829,-1.63735562648104 -2.00621178480549 -1.86242597251066 -1.02470580167082 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
-0.1625189,-1.98898046126935 -0.722008756122123 -0.787896192088153 -1.02470580167082 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
-0.1625189,-1.57881887548545 -2.1887840293994 1.36116336875686 -1.02470580167082 -0.522940888712441 -0.863171185425945 0.342627053981254 -0.155348103855541
-0.1625189,-2.16691708463163 -0.807993896938655 -0.787896192088153 -1.02470580167082 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
0.3715636,-0.507874475300631 -0.458834049396776 -0.250631301876899 -1.02470580167082 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
0.7654678,-2.03612849966376 -0.933954647105133 -1.86242597251066 -1.02470580167082 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
0.8544153,-0.557312518810673 -0.208756571683607 -0.787896192088153 0.990146852537193 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
1.2669476,-0.929360463147704 -0.0578991819441687 0.152317365781542 -1.02470580167082 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
1.2669476,-2.28833047634983 -0.0706369432557794 -0.116315079324086 0.80409888772376 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
1.2669476,0.223498042876113 -1.41471935455355 -0.116315079324086 -1.02470580167082 -0.522940888712441 -0.29928234305568 0.342627053981254 0.199211097885341
1.3480731,0.107785900236813 -1.47221551299731 0.420949810887169 -1.02470580167082 -0.522940888712441 -0.863171185425945 0.342627053981254 -0.687186906466865
1.446919,0.162180092313795 -1.32557369901905 0.286633588334355 -1.02470580167082 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
1.4701758,-1.49795329918548 -0.263601072284232 0.823898478545609 0.788388310173035 -0.522940888712441 -0.29928234305568 0.342627053981254 0.199211097885341
1.4929041,0.796247055396743 0.0476559407005752 0.286633588334355 -1.02470580167082 -0.522940888712441 0.394013435896129 -1.04215728919298 -0.864466507337306
1.5581446,-1.62233848461465 -0.843294091975396 -3.07127197548598 -1.02470580167082 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
1.5993876,-0.990720665490831 0.458513517212311 0.823898478545609 1.07379746308195 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
1.6389967,-0.171901281967138 -0.489197399065355 -0.65357996953534 -1.02470580167082 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
1.6956156,-1.60758252338831 -0.590700340358265 -0.65357996953534 -0.619561070667254 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
1.7137979,0.366273918511144 -0.414014962912583 -0.116315079324086 0.232904453212813 -0.522940888712441 0.971228997418125 0.342627053981254 1.26288870310799
1.8000583,-0.710307384579833 0.211731938156277 0.152317365781542 -1.02470580167082 -0.522940888712441 -0.442797990776478 0.342627053981254 1.61744790484887
1.8484548,-0.262791728113881 -1.16708345615721 0.420949810887169 0.0846342590816532 -0.522940888712441 0.163172393491611 0.342627053981254 1.97200710658975
1.8946169,0.899043117369237 -0.590700340358265 0.152317365781542 -1.02470580167082 -0.522940888712441 1.28643254437683 -1.04215728919298 -0.864466507337306
1.9242487,-0.903451690500615 1.07659722048274 0.152317365781542 1.28380453408541 -0.522940888712441 -0.442797990776478 -1.04215728919298 -0.864466507337306
2.008214,-0.0633337899773081 -1.38088970920094 0.958214701098423 0.80409888772376 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
2.0476928,-1.15393789990757 -0.961853075398404 -0.116315079324086 -1.02470580167082 -0.522940888712441 -0.442797990776478 -1.04215728919298 -0.864466507337306
2.1575593,0.0620203721138446 0.0657973885499142 1.22684714620405 -0.468824786336838 -0.522940888712441 1.31421001659859 1.72741139715549 -0.332627704725983
2.1916535,-0.75731027755674 -2.92717970468456 0.018001143228728 -1.02470580167082 -0.522940888712441 -0.863171185425945 0.342627053981254 -0.332627704725983
2.2137539,1.11226993252773 1.06484916245061 0.555266033439982 0.877691038550889 1.89254797819741 1.43890404648442 0.342627053981254 0.376490698755783
2.2772673,-0.468768642850639 -1.43754788774533 -1.05652863719378 0.576050411655607 -0.522940888712441 0.0120483832567209 0.342627053981254 -0.687186906466865
2.2975726,-0.618884859896728 -1.1366360750781 -0.519263746982526 -1.02470580167082 -0.522940888712441 -0.863171185425945 3.11219574032972 1.97200710658975
2.3272777,-0.651431999123483 0.55329161145762 -0.250631301876899 1.11210019001038 -0.522940888712441 -0.179808625688859 -1.04215728919298 -0.864466507337306
2.5217206,0.115499102435224 -0.512233676577595 0.286633588334355 1.13650173283446 -0.522940888712441 -0.179808625688859 0.342627053981254 -0.155348103855541
2.5533438,0.266341329949937 -0.551137885443386 -0.384947524429713 0.354857790686005 -0.522940888712441 -0.863171185425945 0.342627053981254 -0.332627704725983
2.5687881,1.16902610257751 0.855491905752846 2.03274448152093 1.22628985326088 1.89254797819741 2.02833774827712 3.11219574032972 2.68112551007152
2.6567569,-0.218972367124187 0.851192298581141 0.555266033439982 -1.02470580167082 -0.522940888712441 -0.863171185425945 0.342627053981254 0.908329501367106
2.677591,0.263121415733908 1.4142681068416 0.018001143228728 1.35980653053822 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
2.7180005,-0.0704736333296423 1.52000996595417 0.286633588334355 1.39364261119802 -0.522940888712441 -0.863171185425945 0.342627053981254 -0.332627704725983
2.7942279,-0.751957286017338 0.316843561689933 -1.99674219506348 0.911736065044475 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
2.8063861,-0.685277652430997 1.28214038482516 0.823898478545609 0.232904453212813 -0.522940888712441 -0.863171185425945 0.342627053981254 -0.155348103855541
2.8124102,-0.244991501432929 0.51882005949686 -0.384947524429713 0.823246560137838 -0.522940888712441 -0.863171185425945 0.342627053981254 0.553770299626224
2.8419982,-0.75731027755674 2.09041984898851 1.22684714620405 1.53428167116843 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
2.8535925,1.20962937075363 -0.242882661178889 1.09253092365124 -1.02470580167082 -0.522940888712441 1.24263233939889 3.11219574032972 2.50384590920108
2.9204698,0.570886990493502 0.58243883987948 0.555266033439982 1.16006887775962 -0.522940888712441 1.07357183940747 0.342627053981254 1.61744790484887
2.9626924,0.719758684343624 0.984970304132004 1.09253092365124 1.52137230773457 -0.522940888712441 -0.179808625688859 0.342627053981254 -0.509907305596424
2.9626924,-1.52406140158064 1.81975700990333 0.689582255992796 -1.02470580167082 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
2.9729753,-0.132431544081234 2.68769877553723 1.09253092365124 1.53428167116843 -0.522940888712441 -0.442797990776478 0.342627053981254 -0.687186906466865
3.0130809,0.436161292804989 -0.0834447307428255 -0.519263746982526 -1.02470580167082 1.89254797819741 1.07357183940747 0.342627053981254 1.26288870310799
3.0373539,-0.161195191984091 -0.671900359186746 1.7641120364153 1.13650173283446 -0.522940888712441 -0.863171185425945 0.342627053981254 0.0219314970149
3.2752562,1.39927182372944 0.513852869452676 0.689582255992796 -1.02470580167082 1.89254797819741 1.49394503405693 0.342627053981254 -0.155348103855541
3.3375474,1.51967002306341 -0.852203755696565 0.555266033439982 -0.104527297798983 1.89254797819741 1.85927724828569 0.342627053981254 0.908329501367106
3.3928291,0.560725834706224 1.87867703391426 1.09253092365124 1.39364261119802 -0.522940888712441 0.486423065822545 0.342627053981254 1.26288870310799
3.4355988,1.00765532502814 1.69426310090641 1.89842825896812 1.53428167116843 -0.522940888712441 -0.863171185425945 0.342627053981254 -0.509907305596424
3.4578927,1.10152996153577 -0.10927271844907 0.689582255992796 -1.02470580167082 1.89254797819741 1.97630171771485 0.342627053981254 1.61744790484887
3.5160131,0.100001934217311 -1.30380956369388 0.286633588334355 0.316555063757567 -0.522940888712441 0.28786643052924 0.342627053981254 0.553770299626224
3.5307626,0.987291634724086 -0.36279314978779 -0.922212414640967 0.232904453212813 -0.522940888712441 1.79270085261407 0.342627053981254 1.26288870310799
3.5652984,1.07158528137575 0.606453149641961 1.7641120364153 -0.432854616994416 1.89254797819741 0.528504607720369 0.342627053981254 0.199211097885341
3.5876769,0.180156323255198 0.188987436375017 -0.519263746982526 1.09956763075594 -0.522940888712441 0.708239632330506 0.342627053981254 0.199211097885341
3.6309855,1.65687973755377 -0.256675483533719 0.018001143228728 -1.02470580167082 1.89254797819741 1.79270085261407 0.342627053981254 1.26288870310799
3.6800909,0.5720085322365 0.239854450210939 -0.787896192088153 1.0605418233138 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
3.7123518,0.323806133438225 -0.606717660886078 -0.250631301876899 -1.02470580167082 1.89254797819741 0.342907418101747 0.342627053981254 0.199211097885341
3.9843437,1.23668206715898 2.54220539083611 0.152317365781542 -1.02470580167082 1.89254797819741 1.89037692416194 0.342627053981254 1.26288870310799
3.993603,0.180156323255198 0.154448192444669 1.62979581386249 0.576050411655607 1.89254797819741 0.708239632330506 0.342627053981254 1.79472750571931
4.029806,1.60906277046565 1.10378605019827 0.555266033439982 -1.02470580167082 -0.522940888712441 -0.863171185425945 -1.04215728919298 -0.864466507337306
4.1295508,1.0036214996026 0.113496885050331 -0.384947524429713 0.860016436332751 1.89254797819741 -0.863171185425945 0.342627053981254 -0.332627704725983
4.3851468,1.25591974271076 0.577607033774471 0.555266033439982 -1.02470580167082 1.89254797819741 1.07357183940747 0.342627053981254 1.26288870310799
4.6844434,2.09650591351268 0.625488598331018 -2.66832330782754 -1.02470580167082 1.89254797819741 1.67954222367555 0.342627053981254 0.553770299626224
5.477509,1.30028987435881 0.338383613253713 0.555266033439982 1.00481276295349 1.89254797819741 1.24263233939889 0.342627053981254 1.97200710658975

 

 

Python implementation:

import pandas as pd
import numpy as np

# Read the data file; it has no header row, so pass header=None
# to keep the first sample from being consumed as column names.
dataset = pd.read_csv("../../lpsa.data", header=None)

# Slice the data: column 0 is the label, column 1 holds the
# space-separated feature string.
Y = dataset.iloc[:, :1].values
x_temp = dataset.iloc[:, 1:2].values

# Split each feature string and convert the pieces to floats
X = []
for lines in x_temp:
    a = lines[0].split(" ")
    X.append(np.float64(a))
# Convert the list to an ndarray
X = np.array(X)

# Feature scaling: standardize each column to zero mean, unit variance
print(X[:2, :])
from sklearn.preprocessing import StandardScaler
sc_X = StandardScaler()
X = sc_X.fit_transform(X)
# Use a separate scaler for the target so the feature scaler's
# fitted statistics are not overwritten.
sc_Y = StandardScaler()
Y = sc_Y.fit_transform(Y)
print("========")
print(X[:2, :])
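
StandardScaler standardizes each column to z = (x - mean) / std. As a quick sanity check, the same transform can be reproduced with plain NumPy (a minimal sketch on a toy matrix, not the lpsa data):

import numpy as np

X_raw = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]])  # toy unscaled data
# Column-wise standardization: subtract the column mean and divide by the
# (population) standard deviation, matching StandardScaler's default
X_manual = (X_raw - X_raw.mean(axis=0)) / X_raw.std(axis=0)
print(X_manual)  # each column now has zero mean and unit variance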


# Split into training and test sets (80% / 20%)
from sklearn.model_selection import train_test_split
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size = 0.2, random_state = 0)
print(np.shape(X_test))


# Train the linear regression model
from sklearn.linear_model import LinearRegression
regressor = LinearRegression()
regressor.fit(X_train, Y_train)
# Predict on the test set and compare predictions against the labels
Y_pred = regressor.predict(X_test)
print("prediction" + "\t" + "label")
for i in range(len(Y_pred)):
    print(Y_pred[i], "\t", Y_test[i])
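
The fitted weights and intercept are exposed on the estimator itself, which makes it easy to see what the model learned (reusing the regressor fitted above):

# coef_ holds one weight per (scaled) feature, intercept_ the bias term;
# a prediction is just X_test @ regressor.coef_.T + regressor.intercept_
print(regressor.coef_)
print(regressor.intercept_)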
    
# Save the trained model to disk and load it back.
# (sklearn.externals.joblib is deprecated; recent scikit-learn
# versions require the standalone joblib package instead.)
import os
import joblib
os.chdir("../../../model_save")
joblib.dump(regressor, "train_model.m")
regressor = joblib.load("train_model.m")
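
As a quick sanity check that the round trip preserved the fit, the reloaded model's predictions can be compared with the ones computed earlier (assuming the X_test and Y_pred arrays from above are still in scope):

loaded = joblib.load("train_model.m")
# The reloaded model must reproduce the original predictions exactly
assert np.allclose(loaded.predict(X_test), Y_pred)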


# Evaluate the regression with standard metrics
from sklearn.metrics import mean_squared_error, r2_score

# Mean squared error (MSE): the mean of the squared residuals;
# lower is better.
print("Mean squared error: %.2f"
      % mean_squared_error(Y_test, Y_pred))

# R^2, the coefficient of determination (goodness of fit):
# 1 means a perfect prediction, values near 0 mean the model
# explains little of the variance.
print('Variance score: %.2f' % r2_score(Y_test, Y_pred))
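
Both metrics are easy to reproduce by hand, which makes their definitions concrete (a minimal NumPy check using the Y_test and Y_pred arrays from above):

# MSE = mean((y - y_hat)^2)
mse_manual = np.mean((Y_test - Y_pred) ** 2)
# R^2 = 1 - SS_res / SS_tot
ss_res = np.sum((Y_test - Y_pred) ** 2)
ss_tot = np.sum((Y_test - Y_test.mean()) ** 2)
print(mse_manual, 1 - ss_res / ss_tot)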


# Plot predicted vs. actual values
import matplotlib.pyplot as plt
# Points on the diagonal y = x correspond to perfect predictions
plt.scatter(Y_test, Y_pred, color='black')
plt.plot(Y_test, Y_test, color='blue', linewidth=3)  # y = x reference line

# plt.xticks(())
# plt.yticks(())

plt.show()

 

Spark implementation:

package com.sunbin

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.mllib.regression.LinearRegressionWithSGD
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.linalg.Vectors
import org.apache.log4j.{ Level, Logger }

object LinearRegression {
  def main(args: Array[String]): Unit = {

    val conf = new SparkConf().setMaster("local[2]").setAppName("linear")
    val sc = new SparkContext(conf)
    Logger.getRootLogger.setLevel(Level.WARN)
    val data_path1 = "lpsa.data"
    val data = sc.textFile(data_path1, 2)
    // Parse each line into a LabeledPoint: the value before the comma is
    // the label, the space-separated values after it form the feature vector
    val example = data.map(line => {
      val parts = line.split(",")
      val parts1 = parts(1).split(" ").map(_.toDouble)
      LabeledPoint(parts(0).toDouble, Vectors.dense(parts1))
    }).cache()

    // Split the samples: 80% training set, 20% test set
    val train2TestData = example.randomSplit(Array(0.8, 0.2), 1L)

    /*
     * Number of iterations.
     * Training of a multivariate linear regression model converges
     * (stops iterating) when either:
     *   1. the error falls below a user-specified threshold, or
     *   2. the iteration limit is reached.
     */
    val numIterations = 100
    // Step size used by gradient descent at each iteration
    // (typical values to try: 0.1, 0.2, 0.3, 0.4)
    val stepSize = 1
    val miniBatchFraction = 1
    val lrs = new LinearRegressionWithSGD()
    // Setting this to true gives the model a w0 (intercept) term,
    // which can substantially improve predictions; it is disabled here
    lrs.setIntercept(false)
    // Set the step size
    lrs.optimizer.setStepSize(stepSize)
    // Set the number of iterations
    lrs.optimizer.setNumIterations(numIterations)
    // Fraction of the samples used to compute the error after each step;
    // 1.0 (the default) means all samples are used
    lrs.optimizer.setMiniBatchFraction(miniBatchFraction)
    /**
     * Train the model. When no intercept is used, this is equivalent to
     * the static train(...) call commented out below.
     */
    //    val model = LinearRegressionWithSGD.train(train2TestData(0), numIterations, stepSize, miniBatchFraction)
    val model = lrs.run(train2TestData(0))

    println(model.weights)
    println(model.intercept)
    // Predict on the test set
    val prediction = model.predict(train2TestData(1).map(_.features))
    prediction.foreach(println)

    val predictionAndLabel = prediction.zip(train2TestData(1).map(_.label))

    println("===========")
    predictionAndLabel.foreach(println)

    val print_predict = predictionAndLabel.take(20)
    println("prediction" + "\t" + "label")
    for (i <- 0 to print_predict.length - 1) {
      println(print_predict(i)._1 + "\t" + print_predict(i)._2)
    }

    // Mean absolute error over the test set
    // (the original labeled this RMSE, but the sum of |p - v| divided
    // by the count is the mean absolute error)
    val loss = predictionAndLabel.map {
      case (p, v) =>
        val err = p - v
        Math.abs(err)
    }.reduce(_ + _)
    val error = loss / train2TestData(1).count
    println("Test MAE = " + error)
    // Save the model (uncomment to persist and reload it)
    //    val ModelPath = "model"
    //    model.save(sc, ModelPath)
    //    val sameModel = LinearRegressionModel.load(sc, ModelPath)
    sc.stop()
  }
}
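
Under the hood, LinearRegressionWithSGD runs (mini-batch) gradient descent on the squared-error loss. The sketch below reproduces the core idea in NumPy with the three knobs from the Scala code (step size, iteration count, mini-batch fraction) as explicit parameters; it is illustrative only, not MLlib's actual implementation (which, among other things, also decays the step size across iterations):

import numpy as np

def sgd_linear_regression(X, y, step_size=1.0, num_iterations=100,
                          mini_batch_fraction=1.0, seed=0):
    # Minimize mean squared error (1/n) * sum((X w - y)^2) via gradient descent
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    batch = max(1, int(n * mini_batch_fraction))
    for _ in range(num_iterations):
        idx = rng.choice(n, size=batch, replace=False)  # draw a mini-batch
        Xb, yb = X[idx], y[idx]
        grad = 2.0 / batch * Xb.T @ (Xb @ w - yb)  # gradient of the batch MSE
        w -= step_size * grad                      # step against the gradient
    return w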

 
