libcurl download: file size exceeds the buffer size
I have a question about this code: https://curl.haxx.se/libcurl/c/ftpget.html
In the callback function:
static size_t my_fwrite(void *buffer, size_t size, size_t nmemb, void *stream)
{
  struct FtpFile *out = (struct FtpFile *)stream;
  if(out && !out->stream) {
    /* open file for writing */
    out->stream = fopen(out->filename, "wb");
    if(!out->stream)
      return -1; /* failure, can't open file to write */
  }
  return fwrite(buffer, size, nmemb, out->stream);
}
What happens if the file size exceeds the buffer size? I assume the function is not called repeatedly, because it looks like it would overwrite the file each time. Is there a workaround? Thanks!
From the curl documentation:
The callback function will be passed as much data as possible in all
invokes, but you must not make any assumptions. It may be one byte, it
may be thousands. The maximum amount of body data that will be passed
to the write callback is defined in the curl.h header file:
CURL_MAX_WRITE_SIZE (the usual default is 16K). If CURLOPT_HEADER is
enabled, which makes header data get passed to the write callback, you
can get up to CURL_MAX_HTTP_HEADER bytes of header data passed into
it. This usually means 100K.
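In other words, the callback is invoked repeatedly, once per received chunk (at most CURL_MAX_WRITE_SIZE bytes of body data per call), and the file is not overwritten on each call: fopen(..., "wb") only runs on the first invocation, while out->stream is still NULL, and every later call reuses the already open stream, so each fwrite() continues at the current file position and the chunks land one after another. Below is a minimal sketch of how the example wires the callback into an easy transfer; the URL and output filename are placeholders, not taken from the original question.

#include <stdio.h>
#include <curl/curl.h>

struct FtpFile {
  const char *filename;
  FILE *stream;
};

/* same callback as in the question: opens the file once, then keeps writing */
static size_t my_fwrite(void *buffer, size_t size, size_t nmemb, void *stream)
{
  struct FtpFile *out = (struct FtpFile *)stream;
  if(out && !out->stream) {
    out->stream = fopen(out->filename, "wb"); /* only on the first chunk */
    if(!out->stream)
      return -1; /* returning something other than size*nmemb aborts the transfer */
  }
  return fwrite(buffer, size, nmemb, out->stream); /* advances the file position */
}

int main(void)
{
  CURL *curl;
  CURLcode res;
  struct FtpFile ftpfile = { "output.bin", NULL }; /* placeholder filename */

  curl_global_init(CURL_GLOBAL_DEFAULT);
  curl = curl_easy_init();
  if(curl) {
    /* placeholder URL, not from the original question */
    curl_easy_setopt(curl, CURLOPT_URL, "ftp://example.com/some/file.bin");
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, my_fwrite);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &ftpfile);

    /* curl_easy_perform() calls my_fwrite once per received chunk, so a file
       larger than CURL_MAX_WRITE_SIZE simply results in many callback calls */
    res = curl_easy_perform(curl);
    if(res != CURLE_OK)
      fprintf(stderr, "curl_easy_perform() failed: %s\n", curl_easy_strerror(res));

    curl_easy_cleanup(curl);
  }
  if(ftpfile.stream)
    fclose(ftpfile.stream);
  curl_global_cleanup();
  return 0;
}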