
Solving Logistic Regression: Gradient Descent



Theta: [[-5.09232519] [ 0.04627476] [ 0.04185042]] - Iter: 109155 - Last cost: 0.38 - Duration: 20.83s

array([[-5.09232519],
       [ 0.04627476],
       [ 0.04185042]])

[figure: output_25_2.png]
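Before comparing, a quick orientation: runExpe (defined earlier in the article) trains with a chosen batch size and stopping rule, prints a summary line, and plots the cost curve. The following is only a minimal sketch of such a harness, limited to the STOP_ITER criterion used in this section; the sigmoid/cost/gradient helpers and the data layout (intercept plus features first, label in the last column) are assumptions, not the article's exact code.

import time
import numpy as np

STOP_ITER = 0  # stop after a fixed number of iterations

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def cost(X, y, theta):
    # average cross-entropy of the logistic model
    p = sigmoid(X @ theta)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def gradient(X, y, theta):
    # gradient of the cross-entropy with respect to theta
    return X.T @ (sigmoid(X @ theta) - y) / len(X)

def runExpe(data, theta, batchSize, stopType, thresh, alpha):
    # assumed layout: columns = [intercept, features...], label last
    X, y = data[:, :-1], data[:, -1:]
    theta, k, t0 = theta.copy().astype(float), 0, time.time()
    for i in range(thresh):                  # STOP_ITER criterion only
        batch = slice(k, k + batchSize)
        theta -= alpha * gradient(X[batch], y[batch], theta)
        k += batchSize
        if k >= len(X):                      # finished one pass over the data
            k = 0                            # (the real harness reshuffles here)
    print(f"Theta: {theta.ravel()} - Iter: {thresh} - "
          f"Last cost: {cost(X, y, theta):.2f} - "
          f"Duration: {time.time() - t0:.2f}s")
    return theta

With batchSize=1 each step uses a single sample, which the printouts below label "Stochastic descent"; batchSize=16 gives the "Mini-batch (16)" run.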

# Compare the different gradient descent variants
runExpe(orig_data, theta, 1, STOP_ITER, thresh=5000, alpha=0.001)

***Original data - learning rate: 0.001 - Stochastic descent - Stop: 5000 iterations
Theta: [[ nan] [ nan] [ nan]] - Iter: 5000 - Last cost: nan - Duration: 0.22s

array([[ nan],
       [ nan],
       [ nan]])

[figure: output_26_3.png]
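The nan values have a plausible numerical explanation: with unscaled features the logit x·θ can grow large enough to saturate the sigmoid to exactly 0 or 1 in float64, after which the cross-entropy evaluates log(0). A tiny self-contained illustration (the logit value 50 is arbitrary, not taken from the article's data):

import numpy as np

z = np.array([50.0])      # an arbitrary large logit, easy to reach with unscaled features
p = 1 / (1 + np.exp(-z))  # rounds to exactly 1.0 in float64
print(np.log(1 - p))      # [-inf]; once this is mixed into the mean, the cost shows nan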

Very unstable; the cost even diverges to nan. Let's try a smaller learning rate.

runExpe(orig_data, theta, 1, STOP_ITER, thresh=15000, alpha=0.000002)

***Original data - learning rate: 2e-06 - Stochastic descent - Stop: 15000 iterations
Theta: [[ nan] [ nan] [ nan]] - Iter: 15000 - Last cost: nan - Duration: 1.12s

array([[ nan],
       [ nan],
       [ nan]])

[figure: output_28_3.png]

Stochastic descent is fast, but unstable; it needs a very small learning rate.
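The trade-off comes down to how many samples enter each update. Writing $\sigma$ for the sigmoid and $B$ for a batch of size $b$, each step is

$$\theta \leftarrow \theta - \frac{\alpha}{b}\sum_{i \in B} x_i\left(\sigma(x_i^{\top}\theta) - y_i\right).$$

With $b = 1$ this is the stochastic descent above; with $b = m$ (all samples) it is batch descent; the run below uses a mini-batch of $b = 16$ as a compromise between speed and stability.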

runExpe(orig_data, theta, 16, STOP_ITER, thresh=15000, alpha=0.001)

***Original data - learning rate: 0.001 - Mini-batch (16) descent - Stop: 15000 iterations
Theta: [[-1.03569128] [ 0.02012935] [ 0.00863927]] - Iter: 15000 - Last cost: 0.56 - Duration: 1.16s

array([[-1.03569128],
       [ 0.02012935],
       [ 0.00863927]])

[figure: output_30_2.png]

The loss still fluctuates quite a bit. Let's try standardizing the data: for each feature (column by column), subtract the column's mean and divide by its standard deviation. The result is that, for every feature, the values are centered around 0 with a variance of 1.

from sklearn import preprocessing as pp
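A minimal sketch of how the standardization would be applied with pp.scale, assuming orig_data keeps the added intercept term in column 0, the two raw features in columns 1 and 2, and the label in the last column (these indices are assumptions inferred from the three-element theta):

scaled_data = orig_data.copy()
scaled_data[:, 1:3] = pp.scale(orig_data[:, 1:3])  # z-score: (x - column mean) / column std

runExpe(scaled_data, theta, 1, STOP_ITER, thresh=5000, alpha=0.001)  # same stochastic run, now on scaled data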
              