Video colour difference when publishing via ffmpeg

I am trying to publish a video stream with ffmpeg. As the input source I pipe frame images from Python. However, the colours of the streamed video are different from the original frames.

p = Popen(['ffmpeg', '-y', '-f', 'image2pipe', '-r', '30', '-i', '-', '-vcodec', 'mpeg4', '-qscale', '5', '-r', '30', '-b:a', '32k', '-ar', '44100', '-pix_fmt','rgb24', '-color_range', '3', '-c:v', 'libx264rgb', '-f', 'flv', 'rtmp://stream_ip/live/bbb'], stdin=PIPE)

How can I adjust the ffmpeg command so that the stream keeps the images' original colours?
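For context, a hypothetical sketch of the frame-writing side, which the question does not show: the mjpeg/image2pipe input in the console log below suggests each frame is JPEG-encoded in Python and written to ffmpeg's stdin, e.g. via PIL, which treats the arrays as RGB and would explain the colour shift for OpenCV's BGR frames.

```python
from PIL import Image

# Hypothetical write loop (assumed, not shown in the question): `p` is the
# Popen object above and `frames` yields numpy arrays straight from OpenCV.
def write_frames(p, frames):
    for frame in frames:
        # PIL interprets a 3-channel uint8 array as RGB, so a BGR frame from
        # OpenCV gets JPEG-encoded with its red and blue channels swapped.
        Image.fromarray(frame).save(p.stdin, format='JPEG')
    p.stdin.close()
    p.wait()
```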

Console output

ffmpeg version 4.1.8-0+deb10u1 Copyright (c) 2000-2021 the FFmpeg 
developers
built with gcc 8 (Debian 8.3.0-6)
configuration: --prefix=/usr --extra-version=0+deb10u1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
libavutil      56. 22.100 / 56. 22.100
libavcodec     58. 35.100 / 58. 35.100
libavformat    58. 20.100 / 58. 20.100
libavdevice    58.  5.100 / 58.  5.100
libavfilter     7. 40.101 /  7. 40.101
libavresample   4.  0.  0 /  4.  0.  0
libswscale      5.  3.100 /  5.  3.100
libswresample   3.  3.100 /  3.  3.100
libpostproc    55.  3.100 / 55.  3.100
Input #0, image2pipe, from 'pipe:':
Duration: N/A, bitrate: N/A
Stream #0:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 1152x720 [SAR 1:1 DAR 8:5], 30 fps, 30 tbr, 30 tbn, 30 tbc
Codec AVOption b (set bitrate (in bits/s)) specified for output file #0 
(rtmp://ip/live/bbb) has not been used for any stream. The most likely 
reason is either wrong type (e.g. a video option with no video streams) 
or that it is a private option of some encoder which was not actually 
used for any stream.
Stream mapping:
 Stream #0:0 -> #0:0 (mjpeg (native) -> flv1 (flv))
[swscaler @ 0x5639066a7f80] deprecated pixel format used, make sure you 
did set range correctly
Output #0, flv, to 'rtmp://ip/live/bbb':
  Metadata:
    encoder         : Lavf58.20.100
    Stream #0:0: Video: flv1 (flv) ([2][0][0][0] / 0x0002), yuv420p, 1152x720 [SAR 1:1 DAR 8:5], q=2-31, 200 kb/s, 30 fps, 1k tbn, 30 tbc
    Metadata:
       encoder         : Lavc58.35.100 flv
    Side data:
      cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 2917 fps= 29 q=31.0 size=    8834kB time=00:01:37.20 bitrate= 744.5kbits/s speed=0.952x  

If you read JPEGs, PNGs or video into OpenCV, it keeps the frames in memory in BGR channel order. If you feed such frames to ffmpeg, you must either:

  • convert the frames to RGB in OpenCV with cv2.cvtColor(... cv2.COLOR_BGR2RGB) before sending them to ffmpeg, or
  • tell ffmpeg that the frames are in BGR order by putting -pix_fmt bgr24 before the input specifier, i.e. before -i - (see the sketch after this list).
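A minimal sketch of both options, assuming the frames are OpenCV/numpy arrays. The resolution, frame rate and RTMP URL are taken from the question; note that the -pix_fmt bgr24 route only makes sense if raw (unencoded) frames are piped with -f rawvideo, rather than JPEGs through image2pipe.

```python
import numpy as np
import cv2
from subprocess import Popen, PIPE

# Stand-in for a real OpenCV frame (OpenCV stores frames in BGR order).
bgr_frame = np.zeros((720, 1152, 3), dtype=np.uint8)

# Option 1: convert BGR -> RGB in OpenCV before the frame is JPEG-encoded
# and written to the existing image2pipe input.
rgb_frame = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2RGB)

# Option 2: pipe raw frames and declare their layout as an *input* option,
# i.e. place -pix_fmt bgr24 before -i -.
width, height, fps = 1152, 720, 30
p = Popen(['ffmpeg', '-y',
           '-f', 'rawvideo', '-pix_fmt', 'bgr24',     # input options: describe what arrives on stdin
           '-s', f'{width}x{height}', '-r', str(fps),
           '-i', '-',
           '-c:v', 'libx264', '-pix_fmt', 'yuv420p',  # output options: encode for FLV/RTMP
           '-f', 'flv', 'rtmp://stream_ip/live/bbb'],
          stdin=PIPE)
p.stdin.write(bgr_frame.tobytes())                    # raw BGR bytes, no JPEG step needed
p.stdin.close()
p.wait()
```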