Title: A wrapper around Python's multiprocessing library
Source: Felix021
Date: Sun, 30 Nov 2014 22:09:49 +0000
Author: felix021
URL: https://www.felix021.com/blog/read.php?2141

Content:

A recent project needed something like this: there is a list of items to work through, say a pile of files, and a file_processor that handles a file list in order. I wrote a thin wrapper around the multiprocessing library and found it very convenient; multi-process parallel processing (with no interaction between the processes) took only a few lines:

import sys
import multiprocess

slices = multiprocess.split_list(filelist, 8)  # split into 8 slices
processes = [multiprocess.spawn(file_processor, s) for s in slices]
sys.exit(multiprocess.start_and_join(processes))

And multiprocess.py looks like this:

#!/usr/bin/python
#coding: utf-8

from multiprocessing import Process

def split_list(data, n_slice, hash_func=lambda i, d: i):
    # Split data into n_slice slices; the default hash_func distributes
    # items round-robin by index.
    slices = [[] for _ in range(n_slice)]
    for i, d in enumerate(data):
        slices[hash_func(i, d) % n_slice].append(d)
    return slices

def spawn(target, *args, **kwargs):
    # Wrap target and its arguments in a Process (not yet started).
    return Process(target=target, args=args, kwargs=kwargs)

def start_and_join(processes, killall_if_fail=True):
    # Start every process and wait for each in turn. If one exits with a
    # non-zero code, optionally terminate the ones still running and
    # return that code.
    for p in processes:
        p.start()
    exitcode = 0
    for p in processes:
        p.join()
        if p.exitcode != 0:
            exitcode = p.exitcode
            break
    if exitcode != 0 and killall_if_fail:
        for p in processes:
            if p.is_alive():
                p.terminate()
    return exitcode
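For completeness, here is a minimal, self-contained sketch of how the wrapper might be driven end to end. The file_processor below and the file names it works on are hypothetical stand-ins, not part of the original post; the point is only to show the split_list / spawn / start_and_join flow in one runnable script.

#!/usr/bin/python
#coding: utf-8
# Minimal end-to-end sketch; file_processor and filelist are made up.
import sys
import multiprocess

def file_processor(filenames):
    # Placeholder worker: handle each file in its slice sequentially.
    for name in filenames:
        print("processing %s" % name)

if __name__ == '__main__':
    filelist = ['file%03d.txt' % i for i in range(100)]   # made-up input
    slices = multiprocess.split_list(filelist, 8)          # 8 slices
    processes = [multiprocess.spawn(file_processor, s) for s in slices]
    sys.exit(multiprocess.start_and_join(processes))

The __main__ guard is worth keeping: on platforms where multiprocessing spawns a fresh interpreter instead of forking (e.g. Windows), the module is re-imported in each child, and the guard keeps the children from spawning processes of their own.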