Global variable behavior in Python with unexpected shadowing

Problem description

Why does accessing global variables inside functions go wrong in this code? No exception is raised, but the result is not what I expect.

Version 1: works as expected without global variables

import numpy as np
import matplotlib.pyplot as plt

# Initializations
scale = 2000
a, b, c, d = (np.random.randn() for i in range(4))
x = np.linspace(-np.pi, np.pi, scale)
y1 = np.sin(x)
y2 = a + b * x + c * x ** 2 + d * x ** 3

learning_rate = 1e-6

for i in range(scale):
    # forward pass: evaluate the cubic model with the current coefficients
    y2 = a + b * x + c * x ** 2 + d * x ** 3

    # gradient of the squared-error loss sum((y2 - y1)**2) with respect to y2
    grad_loss = 2.0 * (y2 - y1)

    # chain rule: gradient with respect to each coefficient
    grad_a = grad_loss.sum()
    grad_b = (grad_loss * x).sum()
    grad_c = (grad_loss * x ** 2).sum()
    grad_d = (grad_loss * x ** 3).sum()

    # gradient-descent update
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d

plt.plot(y1)
plt.plot(y2)
plt.show()

It works as expected.

However, when the same globals are used inside functions, the behavior is different:

import numpy as np
import matplotlib.pyplot as plt

# Initializations
scale = 2000
a, b, c, d = (np.random.randn() for i in range(4))
x = np.linspace(-np.pi, np.pi, scale)
y1 = np.sin(x)
y2 = a + b * x + c * x ** 2 + d * x ** 3

def forward():
    y2 = a + b * x + c * x ** 2 + d * x ** 3
    #print(np.square(y2 - y1).sum())

learning_rate = 1e-6

def backward():
    grad_loss = 2.0*(y2-y1)
    
    grad_a = grad_loss.sum()
    grad_b = (grad_loss * x).sum()
    grad_c = (grad_loss * x ** 2).sum()
    grad_d = (grad_loss * x ** 3).sum()
    
    
    global a,b,c,d


    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d


for i in range(scale):
    forward()
    backward()
        

plt.plot(y1)
plt.plot(y2)
plt.show()

[Plot: y2 is drawn with its initial, unfitted values and does not track y1]

Tags: python

Solution


In forward, y2 is assigned. In the code you posted, y2 is not declared global inside that function, so the assignment binds a new local y2 that shadows the module-level one; the global y2, which backward and the final plot read, is never updated.
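A minimal sketch of the rule at play (the names here are illustrative, not from your post): reading a module-level name inside a function needs no declaration, but assigning to one creates a local binding unless the name is declared global.

counter = 0

def bump_local():
    counter = 1  # binds a new local 'counter'; the global is shadowed, not changed

def bump_global():
    global counter
    counter = 1  # rebinds the module-level 'counter'

bump_local()
print(counter)  # 0 -- the global was untouched
bump_global()
print(counter)  # 1

Applied to your code: declare global y2 in forward, and the result matches Version 1: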

import numpy as np
import matplotlib.pyplot as plt

# Initializations
scale = 2000
a, b, c, d = (np.random.randn() for i in range(4))
x = np.linspace(-np.pi, np.pi, scale)
y1 = np.sin(x)
y2 = a + b * x + c * x ** 2 + d * x ** 3

def forward():
    global y2  # without this, the assignment below would create a local y2
    y2 = a + b * x + c * x ** 2 + d * x ** 3
    #print(np.square(y2 - y1).sum())

learning_rate = 1e-6

def backward():
    # declare up front: the updates below rebind the module-level coefficients
    global a, b, c, d

    grad_loss = 2.0 * (y2 - y1)

    grad_a = grad_loss.sum()
    grad_b = (grad_loss * x).sum()
    grad_c = (grad_loss * x ** 2).sum()
    grad_d = (grad_loss * x ** 3).sum()

    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d


for i in range(scale):
    forward()
    backward()

plt.plot(y1)
plt.plot(y2)
plt.show()

Result: the fit now matches Version 1.
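As an aside, the pitfall disappears entirely if no global state is mutated: let the functions take and return their values instead. A minimal sketch under that design (the packed coeffs array and these signatures are mine, not from the original code):

import numpy as np

scale = 2000
learning_rate = 1e-6
x = np.linspace(-np.pi, np.pi, scale)
y1 = np.sin(x)
coeffs = np.random.randn(4)  # a, b, c, d packed into one array

def forward(coeffs):
    a, b, c, d = coeffs
    return a + b * x + c * x ** 2 + d * x ** 3

def backward(coeffs, y2):
    grad_loss = 2.0 * (y2 - y1)
    powers = np.stack([x ** 0, x, x ** 2, x ** 3])  # shape (4, scale)
    grads = (grad_loss * powers).sum(axis=1)        # one gradient per coefficient
    return coeffs - learning_rate * grads

for i in range(scale):
    y2 = forward(coeffs)
    coeffs = backward(coeffs, y2)

Because each function's effect is explicit in its return value, no global declarations are needed.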

