Question:
Right now I have a central module in a framework that spawns multiple processes using the Python 2.6 multiprocessing module. Because it uses multiprocessing, there is a module-level multiprocessing-aware log, LOG = multiprocessing.get_logger(). Per the docs, this logger has process-shared locks so that you don't garble things up in sys.stderr (or whatever filehandle) by having multiple processes writing to it simultaneously.
The issue I have now is that the other modules in the framework are not multiprocessing-aware. The way I see it, I need to make all dependencies on this central module use multiprocessing-aware logging. That's annoying within the framework, let alone for all clients of the framework. Are there alternatives I'm not thinking of?
Solution:
Reference 1: https://stackoom.com/question/2grU
Reference 2: How should I log while using multiprocessing in Python?