python - Gradient descent function on python numpy
Problem Description
import numpy as np

def gradientDescent(X, y, theta, alpha, num_iters):
    print(X.shape, y.shape, theta.shape)
    m = len(y)
    for _ in range(num_iters):
        hypothesis = np.dot(X, theta)      # predictions, shape (m, 1)
        loss = hypothesis - y              # per-example residuals
        print("loss {}".format(loss[0]))   # only the first example's residual
        gradient = np.dot(X.transpose(), loss) / m
        theta = theta - alpha * gradient
    return theta
I have printed the shapes of X, y, and theta, and the loss, for clarification. The inputs are alpha = 0.01 and num_iters = 150. The result starts to diverge after step 6, as shown below:
(97, 2) (97, 1) (2, 1)
loss [-17.592]
loss [-13.5419506]
loss [-12.82427147]
loss [-12.69896095]
loss [-12.67894766]
loss [-12.67764826]
loss [-12.67967143]
loss [-12.68228117]
loss [-12.68499113]
loss [-12.68771485]
loss [-12.69043697]
loss [-12.69315478]
...
loss [-13.01638377]
loss [-13.01851416]
loss [-13.0206407]
loss [-13.02276341]
loss [-13.0248823]
theta = [[-0.86287834]
         [ 0.88834569]]

theta should have been [[-3.6303]
                        [ 1.1664]]
Solution
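The gradient computation itself looks correct. What is printed is only `loss[0]`, the residual of the first training example, not the cost J(θ) = (1/2m)·Σ(Xθ − y)²; a single residual can creep upward while the overall cost is still falling, so the run shown above is not actually diverging. The expected θ ≈ [[-3.6303], [1.1664]] is typically reached with num_iters = 1500 rather than 150 (an assumption, based on the standard single-variable exercise these numbers resemble). A minimal sketch that tracks the full cost per iteration, using synthetic stand-in data since the original dataset is not included in the question:

```python
import numpy as np

def compute_cost(X, y, theta):
    # J(theta) = (1/2m) * sum((X @ theta - y)**2)
    m = len(y)
    residual = X @ theta - y
    return (residual.T @ residual).item() / (2 * m)

def gradient_descent(X, y, theta, alpha, num_iters):
    m = len(y)
    costs = []
    for _ in range(num_iters):
        gradient = X.T @ (X @ theta - y) / m   # same gradient as the question's code
        theta = theta - alpha * gradient
        costs.append(compute_cost(X, y, theta))
    return theta, costs

# Hypothetical stand-in for the (97, 2) dataset in the question.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=(97, 1))
y = -3.6 + 1.17 * x + rng.normal(0.0, 1.0, size=(97, 1))
X = np.hstack([np.ones((97, 1)), x])           # prepend the intercept column
theta, costs = gradient_descent(X, y, np.zeros((2, 1)), alpha=0.01, num_iters=1500)
```

With a small enough alpha, `costs` decreases monotonically even though any individual residual such as `loss[0]` may rise; watching the cost rather than one residual is the reliable way to judge convergence.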