Python Pillow - ValueError: Decompressed Data Too Large

I am using the Pillow library to create thumbnails. I have to create a lot of them, over 10,000 in fact.

The program works fine, but after processing roughly 1,500 images it fails with the following error:

    Traceback (most recent call last):
      File "thumb.py", line 15, in <module>
        im = Image.open('/Users/Marcel/images/07032017/' + infile)
      File "/Users/Marcel/product-/PIL/Image.py", line 2339, in open
        im = _open_core(fp, filename, prefix)
      File "/Users/Marcel/product-/PIL/Image.py", line 2329, in _open_core
        im = factory(fp, filename)
      File "/Users/Marcel/product-/PIL/ImageFile.py", line 97, in __init__
        self._open()
      File "/Users/Marcel/product-/PIL/PngImagePlugin.py", line 538, in _open
        s = self.png.call(cid, pos, length)
      File "/Users/Marcel/product-/PIL/PngImagePlugin.py", line 136, in call
        return getattr(self, "chunk_" + cid.decode('ascii'))(pos, length)
      File "/Users/Marcel/product-/PIL/PngImagePlugin.py", line 319, in chunk_iCCP
        icc_profile = _safe_zlib_decompress(s[i+2:])
      File "/Users/Marcel/product-/PIL/PngImagePlugin.py", line 90, in _safe_zlib_decompress
        raise ValueError("Decompressed Data Too Large")
    ValueError: Decompressed Data Too Large

My program is very simple:

import os, sys
import PIL
from PIL import Image

size = 235, 210  # maximum thumbnail size (width, height)

# Read the product list; each line holds one image file name
reviewedProductsList = open('products.txt', 'r')
reviewedProducts = reviewedProductsList.readlines()
t = map(lambda s: s.strip(), reviewedProducts)

print "Thumbs to create: '%s'" % len(reviewedProducts)

for infile in t:
    outfile = infile
    try:
        im = Image.open('/Users/Marcel/images/07032017/' + infile)
        im.thumbnail(size, Image.ANTIALIAS)
        print "thumb created"
        im.save('/Users/Marcel/product-/thumbs/' + outfile, "JPEG")
    except IOError, e:
        print "cannot create thumbnail for '%s'" % infile
        print "error: '%s'" % e

I am running this locally on my MacBook Pro.

This error is Pillow protecting servers against a potential DoS attack caused by decompression bombs. It occurs when a decompressed image turns out to have metadata (here, an iCCP chunk) that is too large. See http://pillow.readthedocs.io/en/4.0.x/handbook/image-file-formats.html?highlight=decompression#png

The CVE report is here: https://www.cvedetails.com/cve/CVE-2014-9601/

From a recent issue:

If you set ImageFile.LOAD_TRUNCATED_IMAGES to true, it will suppress the error (but still not read the large metadata). Alternately, you can change the values here: https://github.com/python-pillow/Pillow/blob/master/PIL/PngImagePlugin.py#L74

https://github.com/python-pillow/Pillow/issues/2445
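A minimal sketch of that first suggestion, assuming the flag is set once at the top of the script from the question, before any image is opened:

from PIL import ImageFile

# Per the issue comment: let Pillow skip data it refuses to read fully,
# so the oversized iCCP metadata no longer raises ValueError.
ImageFile.LOAD_TRUNCATED_IMAGES = True

# ... the existing Image.open() / im.thumbnail() loop stays unchanged ...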

The following code helps you set the value mentioned in the accepted answer.

from PIL import PngImagePlugin

# Allow PNG metadata (iCCP/text) chunks of up to 100 MiB of decompressed data
LARGE_ENOUGH_NUMBER = 100
PngImagePlugin.MAX_TEXT_CHUNK = LARGE_ENOUGH_NUMBER * (1024**2)

How to set this value is not documented anywhere. I hope people find this useful.
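For completeness, a sketch of where that override would sit relative to the code from the question (the file name here is a placeholder):

from PIL import Image, PngImagePlugin

# Raise the limit on decompressed PNG metadata before any image is opened.
PngImagePlugin.MAX_TEXT_CHUNK = 100 * (1024**2)

im = Image.open('/Users/Marcel/images/07032017/example.png')  # placeholder file name
im.thumbnail((235, 210), Image.ANTIALIAS)
im.save('/Users/Marcel/product-/thumbs/example.png', "JPEG")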