How can I avoid read errors when two python scripts access a single pickle file?
I need to pass data between two Python scripts that are launched from two different sources. My logger is showing read errors. Is there a better way to do this?
Program A: writes a pickle file periodically, roughly once a minute.
import cPickle

def cacheData(filepath, data):
    #create a cPickle file to cache the current data
    try:
        outFile = open(filepath, 'wb')
        #cPickle the current data to a file
        cPickle.dump(data, outFile)
        outFile.close()
    except Exception, e:
        logger.warn("Error creating cache file")
Program B: a compiled executable launched by the user. It reads the pickle file to kick off some code.
def readCachedObj(filepath):
    #read cPickle file and return data as object
    try:
        inFile = open(filepath, 'rb')
        cache = cPickle.load(inFile)
        inFile.close()
        return cache
    except Exception, e:
        logger.warn("Error reading cached data cPickle")
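The failure mode here is that B can open the file while A is halfway through writing it, so `cPickle.load` sees a truncated stream. For reference, the same write/read round trip on Python 3 (where `cPickle` was merged into `pickle`) can be sketched as follows; the function and file names are illustrative, not from the original code:

```python
import pickle

def cache_data(filepath, data):
    # Serialize data to a pickle file (Python 3 equivalent of cacheData).
    with open(filepath, 'wb') as out_file:
        pickle.dump(data, out_file)

def read_cached_obj(filepath):
    # Read the pickle file back and return the stored object.
    with open(filepath, 'rb') as in_file:
        return pickle.load(in_file)

cache_data('cache.pkl', {'count': 3})
print(read_cached_obj('cache.pkl'))  # → {'count': 3}
```

This round trip is only safe when the two calls cannot overlap; with two independent processes, that guarantee is exactly what is missing.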
Update 1
import cPickle
import glob
import os
import uuid
import win32api
import win32con

def replace(src, dst):
    win32api.MoveFileEx(src, dst, win32con.MOVEFILE_REPLACE_EXISTING)

def cacheData(filepath, data):
    #create a cPickle file to cache the current data
    try:
        tmpfile = str(uuid.uuid4()) + '.tmp'
        outFile = open(tmpfile, 'wb')
        #cPickle the current data to a temp file
        cPickle.dump(data, outFile)
        outFile.close()
        #replace the pickle file with the new temp file
        replace(tmpfile, filepath)
        #remove any extraneous temp files
        for f in glob.glob("*.tmp"):
            os.remove(f)
    except Exception, e:
        logger.warn("Error creating cache file")
def readCachedObj(filepath):
    #read cPickle file and return data as object
    try:
        inFile = open(filepath, 'rb')
        cache = cPickle.load(inFile)
        inFile.close()
        return cache
    except Exception, e:
        logger.warn("Error reading cached data cPickle")
Never overwrite the existing file. Instead, write to a new file and, once it has been closed successfully, perform an (atomic) rename over the old one.
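On Python 3.3+ this advice needs no win32 calls: `os.replace` performs an atomic, overwriting rename on both POSIX and Windows. A minimal sketch, assuming the temp file is created in the same directory as the target (a rename is only atomic within one filesystem); the function name is illustrative:

```python
import os
import pickle
import tempfile

def cache_data_atomic(filepath, data):
    # Write to a temp file in the SAME directory as the target so the
    # final rename stays on one filesystem, which is required for atomicity.
    dirname = os.path.dirname(os.path.abspath(filepath))
    fd, tmp_path = tempfile.mkstemp(suffix='.tmp', dir=dirname)
    try:
        with os.fdopen(fd, 'wb') as out_file:
            pickle.dump(data, out_file)
        # Atomic, overwriting rename (POSIX and Windows, Python 3.3+).
        os.replace(tmp_path, filepath)
    except Exception:
        os.remove(tmp_path)  # don't leave an orphaned temp file behind
        raise

cache_data_atomic('cache.pkl', [1, 2, 3])
```

A reader that opens `filepath` always sees either the complete old file or the complete new one, never a partial write.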
This seems to work for me. By writing to a new temp file and then swapping it into place, the cacheData function only touches the final pickle file for an instant.
import cPickle
import glob
import os
import uuid
import win32api
import win32con

def replace(src, dst):
    win32api.MoveFileEx(src, dst, win32con.MOVEFILE_REPLACE_EXISTING)

def cacheData(filepath, data):
    #create a cPickle file to cache the current data
    try:
        tmpfile = str(uuid.uuid4()) + '.tmp'
        outFile = open(tmpfile, 'wb')
        #cPickle the current data to a temp file
        cPickle.dump(data, outFile)
        outFile.close()
        #replace the pickle file with the new temp file
        replace(tmpfile, filepath)
        #remove any extraneous temp files
        for f in glob.glob("*.tmp"):
            os.remove(f)
    except Exception, e:
        logger.warn("Error creating cache file")
def readCachedObj(filepath):
    #read cPickle file and return data as object
    try:
        inFile = open(filepath, 'rb')
        cache = cPickle.load(inFile)
        inFile.close()
        return cache
    except Exception, e:
        logger.warn("Error reading cached data cPickle")
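One residual caveat on Windows: `MoveFileEx` can fail if the reader happens to have the destination open at the moment of the swap, so the reader (or writer) may still want a short retry loop rather than a single attempt. A hedged Python 3 sketch of a retrying reader; the attempt count and delay are assumptions, not part of the original code:

```python
import pickle
import time

def read_cached_obj_retry(filepath, attempts=5, delay=0.2):
    # Retry briefly in case the writer is mid-swap when we open the file.
    last_error = None
    for _ in range(attempts):
        try:
            with open(filepath, 'rb') as in_file:
                return pickle.load(in_file)
        except (OSError, EOFError, pickle.UnpicklingError) as exc:
            last_error = exc
            time.sleep(delay)
    raise last_error

# Usage: write a cache file, then read it back with retries.
with open('retry_demo.pkl', 'wb') as f:
    pickle.dump('ok', f)
print(read_cached_obj_retry('retry_demo.pkl'))  # → ok
```

Raising after the final attempt (instead of swallowing the exception as the original `readCachedObj` does) lets the caller distinguish "no data yet" from a real failure.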