python - multiprocessing shared-memory array does not change size in the child process
Question
So this is a bit involved, please bear with me. I am subclassing multiprocessing's Process class, like so:
import dill
import multiprocessing as mp

class MyProcess(mp.Process):
    def __init__(self):
        super().__init__()
        self.q = mp.Queue()
        self.array = mp.Array('i', [0, 0, 0])  # starts off with a size of 3 items

    def run(self):
        """retrieve serialized functions from the queue and call them"""
        while True:
            f = dill.loads(self.q.get())
            f(self.array)  # pass the array as the first argument

    def my_method(self):
        """change the size of the array and make the values all equal 1"""
        # increment the size of the array by 1
        self.array = mp.Array('i', [0, 0, 0, 0])
        # print here to show the size of the array before the function call
        print(len(self.array[:]))

        # define a nested function here to serialize and put in the queue
        def f(array):
            # print the size of the array here during the function call
            print(len(array[:]))
            # modify the array contents
            with array.get_lock():
                array[:] = [1, 1, 1, 1]

        # serialize the function and put it in the queue
        self.q.put(dill.dumps(f))
The child process retrieves serialized functions from the queue and calls them as they come in. These functions fill a shared array object whose size changes. When I instantiate the subclass and call the my_method method, I expect the array's size to have grown to 4 before the function call; however, as you can see from the print statements, that is not the case. The array's size is still 3:
>>> p = MyProcess()
>>> p.start()
>>> p.my_method()
4
3
<traceback>
ValueError: Can only assign sequence of same size
Can someone explain why this happens?
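The 4-then-3 output can be reproduced without dill at all: rebinding self.array after start() only rebinds the attribute on the parent's copy of the object, while the child keeps the three-element array it received when the process was launched. A minimal sketch of this, under the assumption that a simple Event/Queue handshake stands in for the function queue (Demo, go, and out are illustrative names, not from the original code):

```python
import multiprocessing as mp

class Demo(mp.Process):
    def __init__(self):
        super().__init__()
        self.go = mp.Event()               # parent signals "attribute was rebound"
        self.out = mp.Queue()              # child reports what it sees
        self.array = mp.Array('i', [0, 0, 0])

    def run(self):
        # runs in the child process, which holds its own copy of self
        self.go.wait()
        self.out.put(len(self.array[:]))   # length of the CHILD's array

if __name__ == '__main__':
    p = Demo()
    p.start()
    # rebinding the attribute only affects the parent's copy of the object
    p.array = mp.Array('i', [0, 0, 0, 0])
    p.go.set()
    print(len(p.array[:]))  # 4 in the parent
    print(p.out.get())      # 3 reported by the child
    p.join()
```

This behaves the same under both fork and spawn start methods: either way, the child's view of the instance is fixed at start(), so later attribute assignments in the parent never reach it.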
EDIT: So I'm an idiot, this has already been answered here. You cannot change the size of a shared-memory array after it has been instantiated; however, you can use a list produced by a multiprocessing Manager instance to store a dynamically sized array.
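A sketch of that Manager-based approach: a manager.list() proxy can be appended to and resliced from any process, and the change is visible everywhere, since all access goes through the manager process (grow_and_fill is an illustrative name):

```python
import multiprocessing as mp

def grow_and_fill(shared):
    # runs in the child: append a fourth element, then set all values to 1
    shared.append(0)
    shared[:] = [1, 1, 1, 1]

if __name__ == '__main__':
    with mp.Manager() as manager:
        shared = manager.list([0, 0, 0])   # starts with 3 items, like mp.Array
        p = mp.Process(target=grow_and_fill, args=(shared,))
        p.start()
        p.join()
        print(len(shared), list(shared))   # 4 [1, 1, 1, 1] — the resize is visible
```

The trade-off is speed: every read and write is an IPC round-trip to the manager process, rather than a direct access to shared memory as with mp.Array.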