pyserial with multiprocessing gives me a ctypes error

Problem description

Hi, I'm trying to write a module that lets me communicate over pyserial. I need to be able to read the data in parallel with my main script. With the help of a stackoverflow user I have a basic, working skeleton of the program, but when I try to add the class I created that uses pyserial (looking up the port, baud rate, etc.), I get the following error:

File "<ipython-input-1-830fa23bc600>", line 1, in <module>
    runfile('C:.../pythonInterface1/Main.py', wdir='C:/Users/Daniel.000/Desktop/Daniel/Python/pythonInterface1')

  File "C:...\Anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 827, in runfile
    execfile(filename, namespace)

  File "C:...\Anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 110, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)

  File "C:/Users/Daniel.000/Desktop/Daniel/Python/pythonInterface1/Main.py", line 39, in <module>
    p.start()

  File "C:...\Anaconda3\lib\multiprocessing\process.py", line 112, in start
    self._popen = self._Popen(self)

  File "C:...\Anaconda3\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)

  File "C:...\Anaconda3\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)

  File "C:...\Anaconda3\lib\multiprocessing\popen_spawn_win32.py", line 89, in __init__
    reduction.dump(process_obj, to_child)

  File "C:...\Anaconda3\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)

ValueError: ctypes objects containing pointers cannot be pickled

This is the code I use to call the class from SerialConnection.py:

import multiprocessing 
from time import sleep
from operator import methodcaller

from SerialConnection import SerialConnection as SC

class Spawn:
    def __init__(self, _number, _max):
        self._number = _number
        self._max = _max
        # Don't call update here

    def request(self, x):
        print("{} was requested.".format(x))

    def update(self):
        while True:
            print("Spawned {} of {}".format(self._number, self._max))
            sleep(2)

if __name__ == '__main__':
    '''
    spawn = Spawn(1, 1)  # Create the object as normal
    p = multiprocessing.Process(target=methodcaller("update"), args=(spawn,)) # Run the loop in the process
    p.start()
    while True:
        sleep(1.5)
        spawn.request(2)  # Now you can reference the "spawn"
    '''
    device = SC()
    print(device.Port)
    print(device.Baud)
    print(device.ID)
    print(device.Error)
    print(device.EMsg)
    p = multiprocessing.Process(target=methodcaller("ReadData"), args=(device,)) # Run the loop in the process
    p.start()
    while True:
        sleep(1.5)
        device.SendData('0003')

What am I doing wrong that makes this class cause problems? Is there some kind of limitation on using pyserial together with multiprocessing? I know it can be done, but I don't understand how...

Here is the traceback I get from python:

Traceback (most recent call last):

  File "C:...\Python\pythonInterface1\Main.py", line 45, in <module>
    p.start()

  File "C:...\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)

  File "C:...\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)

  File "C:...\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)

  File "C:...\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
    reduction.dump(process_obj, to_child)

  File "C:...\AppData\Local\Programs\Python\Python36-32\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)

ValueError: ctypes objects containing pointers cannot be pickled

Tags: python, python-3.x, multiprocessing, pickle, pyserial

Solution


You are trying to pass a SerialConnection instance as an argument to another process. To do that, Python must first serialize (pickle) the object, and that is not possible for a SerialConnection object, because it holds ctypes objects containing pointers (the OS-level serial handle) — exactly what the ValueError says.
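The root cause is easy to reproduce without pyserial at all; a minimal sketch — plain ctypes values pickle fine, but any ctypes pointer type raises the same ValueError seen in the traceback:

```python
import ctypes
import pickle

# A simple ctypes value (no pointers) pickles without complaint.
pickle.dumps(ctypes.c_double(1.0))

# A ctypes pointer type triggers the exact error from the question.
try:
    pickle.dumps(ctypes.c_void_p(0))
except ValueError as e:
    print(e)  # ctypes objects containing pointers cannot be pickled
```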

As explained in Rob Streeting's answer, one possible solution would be to let the SerialConnection object be copied into the other process's memory via the fork that happens when multiprocessing.Process.start is called, but that won't work on Windows, because Windows does not use fork.
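You can see the platform difference directly by checking the default start method: 'spawn' (the only option on Windows) must pickle the Process target and args, while 'fork' (the Linux default) clones the parent's memory and needs no pickling:

```python
import multiprocessing

# 'spawn' on Windows and macOS, 'fork' on Linux by default.
print(multiprocessing.get_start_method())
```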

The easier, cross-platform, and more efficient way to achieve parallelism in your code is to use a thread instead of a process. Threads share the interpreter's memory, so nothing needs to be pickled, and the change to your code is minimal:

import threading
p = threading.Thread(target=methodcaller("ReadData"), args=(device,))
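Here is a runnable sketch of the threaded version. Since SerialConnection is your own class, a hypothetical stand-in (FakeDevice, with the same ReadData/SendData method names) is used in its place:

```python
import threading
from time import sleep
from operator import methodcaller

class FakeDevice:
    """Hypothetical stand-in for the asker's SerialConnection class."""
    def __init__(self):
        self.log = []

    def ReadData(self):            # runs in the background thread
        for _ in range(3):
            self.log.append("read")
            sleep(0.1)

    def SendData(self, payload):   # called from the main thread
        self.log.append("sent " + payload)

device = FakeDevice()
# threading.Thread instead of multiprocessing.Process: the thread shares
# memory with the main thread, so `device` is never serialized.
p = threading.Thread(target=methodcaller("ReadData"), args=(device,))
p.start()
sleep(0.15)
device.SendData('0003')            # same object, no pickling involved
p.join()
print(device.log)
```

Both the thread and the main code operate on the very same `device` object, which is exactly what the pickling step of multiprocessing prevents on Windows.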
