Today a colleague asked me to help look at a problem: several processes share one variable, but after it is modified in one process, the change does not appear in the other processes.
1. The wrong implementation
At first I assumed the modification did not take effect because the global declaration was missing, but in practice the global approach in multiprocessing can also only read, not write. The incorrect example code is as follows:
import multiprocessing

# declare a global variable
share_var = ["start flag"]

def sub_process(process_name):
    # try to use the global variable via a global declaration, as you would in a single process
    global share_var
    share_var.append(process_name)
    # unfortunately, in multiprocessing this reference is effectively read-only:
    # the modification is not synchronized to the other processes
    for item in share_var:
        print(f"{process_name}-{item}")

def main_process():
    process_list = []
    # create process 1
    process_name = "process 1"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name,))
    process_list.append(tmp_process)
    # create process 2
    process_name = "process 2"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name,))
    process_list.append(tmp_process)
    # start all processes
    for process in process_list:
        process.start()
    for process in process_list:
        process.join()

if __name__ == "__main__":
    main_process()
Running this, you can see that the modification made in process 1 is not visible in process 2 (note that, as with multithreading, under a heavier workload process 1 will not necessarily run before process 2).
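To make the point concrete, here is a minimal sketch (my own check, not part of the original example) that prints the parent's view of the list after the children exit; with the global approach the parent still sees only the initial value, because each child works on its own copy of the module-level list:

import multiprocessing

share_var = ["start flag"]

def sub_process(process_name):
    global share_var
    share_var.append(process_name)        # only modifies this child's own copy
    print(f"child sees: {share_var}")

def main_process():
    workers = [multiprocessing.Process(target=sub_process, args=(f"process {i}",)) for i in (1, 2)]
    for p in workers:
        p.start()
    for p in workers:
        p.join()
    # the parent still sees only the initial value
    print(f"parent sees: {share_var}")    # -> parent sees: ['start flag']

if __name__ == "__main__":
    main_process()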
2. Sharing variables of ordinary types
Reference: https://blog.csdn.net/houyanhua1/article/details/78244288
import multiprocessing

# The shared variable and the shared lock cannot be defined as globals and referenced with global
# (that raises an error); they have to be passed in as arguments
def sub_process(process_name, share_var, share_lock):
    # acquire the lock
    share_lock.acquire()
    share_var.append(process_name)
    # release the lock
    share_lock.release()
    for item in share_var:
        print(f"{process_name}-{item}")

def main_process():
    # Single-value form. typecode is the type code (C-style or Python-style, see the tables below);
    # value is the initial value. Reads and writes go through get()/set() (or the .value property),
    # not direct assignment as with an ordinary variable.
    # share_var = multiprocessing.Manager().Value(typecode, value)
    # Array form. typecode is the element type; sequence is the initial contents
    # share_var = multiprocessing.Manager().Array(typecode, sequence)
    # Dict form
    # share_var = multiprocessing.Manager().dict()
    # List form
    share_var = multiprocessing.Manager().list()
    share_var.append("start flag")
    # declare a process-level shared lock
    # Do not pass thread-level objects such as threading.Lock() or queue.Queue() to processes;
    # use their process-level counterparts, otherwise you get errors like
    # "TypeError: can't pickle _thread.lock objects"
    share_lock = multiprocessing.Manager().Lock()
    process_list = []
    process_name = "process 1"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name, share_var, share_lock))
    process_list.append(tmp_process)
    process_name = "process 2"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name, share_var, share_lock))
    process_list.append(tmp_process)
    for process in process_list:
        process.start()
    for process in process_list:
        process.join()

if __name__ == "__main__":
    main_process()
Running this, you can see that the modification made in process 1 is visible in process 2 (again, as with multithreading, under a heavier workload process 1 will not necessarily run before process 2).
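The commented-out dict() form in the code above works the same way. Below is a minimal sketch (the names result_dict and the "done by ..." strings are mine, not from the reference) in which each worker records an entry in a shared Manager().dict() under the shared lock, and the parent reads the combined result after join():

import multiprocessing

def sub_process(process_name, result_dict, share_lock):
    # each worker stores its own entry in the shared dict
    with share_lock:                      # the Manager().Lock() proxy also works as a context manager
        result_dict[process_name] = f"done by {process_name}"

def main_process():
    manager = multiprocessing.Manager()
    result_dict = manager.dict()          # shared dict, same idea as the shared list above
    share_lock = manager.Lock()
    process_list = [
        multiprocessing.Process(target=sub_process, args=(f"process {i}", result_dict, share_lock))
        for i in (1, 2)
    ]
    for p in process_list:
        p.start()
    for p in process_list:
        p.join()
    # convert to a plain dict to print the gathered results in the parent
    print(dict(result_dict))

if __name__ == "__main__":
    main_process()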
If the value is a number or a single character, typecode can be one of the following (note the quotes):
| Type Code | C Type | Python Type |
|-----------|--------|-------------|
| 'c' | char | character |
| 'b' | signed char | int |
| 'B' | unsigned char | int |
| 'u' | Py_UNICODE | unicode character |
| 'h' | signed short | int |
| 'H' | unsigned short | int |
| 'i' | signed int | int |
| 'I' | unsigned int | int |
| 'l' | signed long | int |
| 'L' | unsigned long | int |
| 'f' | float | float |
| 'd' | double | float |
For string-like types, typecode can take one of the forms in the first column below (note: no quotes):
| ctypes type | C type | Python type |
|-------------|--------|-------------|
| c_bool | _Bool | bool |
| c_char | char | 1-character string |
| c_wchar | wchar_t | 1-character unicode string |
| c_byte | char | int/long |
| c_ubyte | unsigned char | int/long |
| c_short | short | int/long |
| c_ushort | unsigned short | int/long |
| c_int | int | int/long |
| c_uint | unsigned int | int/long |
| c_long | long | int/long |
| c_ulong | unsigned long | int/long |
| c_longlong | __int64 or long long | int/long |
| c_ulonglong | unsigned __int64 or unsigned long long | int/long |
| c_float | float | float |
| c_double | double | float |
| c_longdouble | long double | float |
| c_char_p | char * (NUL terminated) | string or None |
| c_wchar_p | wchar_t * (NUL terminated) | unicode or None |
| c_void_p | void * | int/long or None |
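As a quick illustration of the Value and Array forms commented out in section 2, here is a minimal sketch (variable names are mine) using quoted typecodes from the first table; reads and writes on the single value go through get()/set() or the .value property, and the array proxy is read and written by index:

import multiprocessing

def sub_process(process_name, counter, samples, share_lock):
    with share_lock:
        # single value: read/write through get()/set() (or the .value property)
        counter.set(counter.get() + 1)
        # array: read/write by index, like a fixed-length list
        samples[0] = samples[0] + 1
    print(f"{process_name}: counter={counter.value}, samples={list(samples)}")

def main_process():
    manager = multiprocessing.Manager()
    counter = manager.Value('i', 0)                  # quoted typecode from the first table
    # the unquoted ctypes names from the second table can be passed in the same position,
    # e.g. manager.Value(ctypes.c_int, 0) after "import ctypes"
    samples = manager.Array('d', [0.0, 1.5, 3.0])    # 'd' -> C double / Python float
    share_lock = manager.Lock()
    process_list = [
        multiprocessing.Process(target=sub_process, args=(f"process {i}", counter, samples, share_lock))
        for i in (1, 2)
    ]
    for p in process_list:
        p.start()
    for p in process_list:
        p.join()
    print(f"final counter={counter.value}, samples={list(samples)}")

if __name__ == "__main__":
    main_process()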
3. Sharing an instantiated object
My colleague also wanted to share a file object, and asked whether the methods above can only share dicts and lists and cannot share objects.
Looking back at the API, the typecode in Value and Array must be a type that exists in C, and beyond those the Manager only seems to offer the dict() and list() methods, so it appears the approaches above cannot share an instantiated object.
3.1 Sharing an object that does not need to be modified ---- using global
As noted above, the global approach cannot propagate modifications, but reading works fine, so an object can still be referenced via global.
import multiprocessing
import threading

# instantiate a global file object
file_obj = open("1.txt", "a")
# note: a threading.Lock referenced via global is copied into each child process,
# so it does not actually provide cross-process exclusion (a Manager().Lock() would, see section 2)
share_lock = threading.Lock()

def sub_process(process_name):
    global file_obj, share_lock
    share_lock.acquire()
    file_obj.writelines(f"{process_name}")
    share_lock.release()

def main_process():
    process_list = []
    # create process 1
    process_name = "process 1"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name,))
    process_list.append(tmp_process)
    # create process 2
    process_name = "process 2"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name,))
    process_list.append(tmp_process)
    # start all processes
    for process in process_list:
        process.start()
    for process in process_list:
        process.join()

if __name__ == "__main__":
    main_process()
3.2 Sharing an object that needs to be modified ---- using BaseManager
The global approach cannot modify the variable (for example, change its member attributes). That is acceptable most of the time, but it still feels like an incomplete solution. Is there an approach that allows modification? Yes: use BaseManager. Example code follows.
Reference: https://blog.csdn.net/jacke121/article/details/82658471
import multiprocessing
from multiprocessing.managers import BaseManager
import threading

# the lock can be referenced via global or passed through the Process arguments
# (note that a threading.Lock is copied per process, so it does not lock across processes;
# use a Manager().Lock() if real cross-process exclusion is needed)
share_lock = threading.Lock()

# define the class whose instances we want to share
class Test:
    def __init__(self):
        self.test_list = ["start flag"]

    def test_function(self, arg):
        self.test_list.append(arg)

    def print_test_list(self):
        for item in self.test_list:
            print(f"{item}")

def sub_process(process_name, obj):
    global share_lock
    share_lock.acquire()
    obj.test_function(f"{process_name}")
    share_lock.release()
    obj.print_test_list()

def main_process():
    # if you want to register the built-in open(), it looks like this:
    # manager = BaseManager()
    # # registration must happen before start(), otherwise it has no effect
    # manager.register('open', open)
    # manager.start()
    # obj = manager.open("1.txt", "a")

    # to keep things direct, we demonstrate with an instance of the Test class
    manager = BaseManager()
    # registration must happen before start(), otherwise it has no effect
    manager.register('Test', Test)
    manager.start()
    obj = manager.Test()
    process_list = []
    # create process 1
    process_name = "process 1"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name, obj))
    process_list.append(tmp_process)
    # create process 2
    process_name = "process 2"
    tmp_process = multiprocessing.Process(target=sub_process, args=(process_name, obj))
    process_list.append(tmp_process)
    # start all processes
    for process in process_list:
        process.start()
    for process in process_list:
        process.join()

if __name__ == "__main__":
    main_process()
Running this, you can see that the modification made in process 1 is visible in process 2 (again, as with multithreading, under a heavier workload process 1 will not necessarily run before process 2).
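For completeness, here is a minimal sketch expanding the commented-out open() registration shown above (my own expansion, assuming a writable local file 1.txt); both processes write through a single file object that lives in the manager process, guarded by a process-level lock as in section 2:

import multiprocessing
from multiprocessing.managers import BaseManager

def sub_process(process_name, file_proxy, share_lock):
    with share_lock:
        # the proxy forwards write() to the one real file object hosted by the manager process
        file_proxy.write(f"{process_name}\n")

def main_process():
    manager = BaseManager()
    # register before start(), exactly as noted in the example above
    manager.register('open', open)
    manager.start()
    file_proxy = manager.open("1.txt", "a")
    share_lock = multiprocessing.Manager().Lock()   # process-level lock, as in section 2
    process_list = [
        multiprocessing.Process(target=sub_process, args=(f"process {i}", file_proxy, share_lock))
        for i in (1, 2)
    ]
    for p in process_list:
        p.start()
    for p in process_list:
        p.join()
    file_proxy.close()    # closes the real file inside the manager process
    manager.shutdown()

if __name__ == "__main__":
    main_process()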
References:
https://blog.51cto.com/11026142/1874807
https://docs.python.org/3/library/multiprocessing.html#module-multiprocessing