Webcam source to EVR sink
I'm able to display the video stream from an mp4 file by writing the byte samples directly to an Enhanced Video Renderer (EVR) sink (thanks to the answer on ).
I'd like to do the same thing with a webcam source. The problem I currently have is that my webcam only supports the RGB24 and I420 formats, and as far as I can tell the EVR only supports RGB32. In some Media Foundation scenarios I believe the conversion happens automatically, provided the CColorConvertDMO class is registered in the process. I have done that, but I suspect that because of the way I'm writing samples to the EVR the colour conversion is never invoked.
My question is: what approach should I take so that RGB24 samples read from my webcam IMFSourceReader can be written to the EVR IMFStreamSink?
My full sample program is here; unfortunately it's rather long due to the Media Foundation plumbing required. Below is the block where I attempt to match the EVR sink media type to the webcam source media type.
The problem is with setting the MF_MT_SUBTYPE attribute. As far as I can tell it must be MFVideoFormat_RGB32 for the EVR, but my webcam only accepts MFVideoFormat_RGB24.
IMFMediaSource* pVideoSource = NULL;
IMFSourceReader* pVideoReader = NULL;
IMFMediaType* videoSourceOutputType = NULL, * pvideoSourceModType = NULL;
IMFMediaType* pVideoOutType = NULL;
IMFMediaType* pHintMediaType = NULL;
IMFMediaSink* pVideoSink = NULL;
IMFStreamSink* pStreamSink = NULL;
IMFSinkWriter* pSinkWriter = NULL;
IMFMediaTypeHandler* pSinkMediaTypeHandler = NULL, * pSourceMediaTypeHandler = NULL;
IMFPresentationDescriptor* pSourcePresentationDescriptor = NULL;
IMFStreamDescriptor* pSourceStreamDescriptor = NULL;
IMFVideoRenderer* pVideoRenderer = NULL;
IMFVideoDisplayControl* pVideoDisplayControl = NULL;
IMFGetService* pService = NULL;
IMFActivate* pActive = NULL;
IMFPresentationClock* pClock = NULL;
IMFPresentationTimeSource* pTimeSource = NULL;
IDirect3DDeviceManager9* pD3DManager = NULL;
IMFVideoSampleAllocator* pVideoSampleAllocator = NULL;
IMFSample* pD3DVideoSample = NULL;
RECT rc = { 0, 0, VIDEO_WIDTH, VIDEO_HEIGHT };
BOOL fSelected = false;
CHECK_HR(CoInitializeEx(NULL, COINIT_APARTMENTTHREADED | COINIT_DISABLE_OLE1DDE),
"COM initialisation failed.");
CHECK_HR(MFStartup(MF_VERSION),
"Media Foundation initialisation failed.");
//CHECK_HR(ListCaptureDevices(DeviceType::Video),
// "Error listing video capture devices.");
// Need the color converter DSP for conversions between YUV, RGB etc.
CHECK_HR(MFTRegisterLocalByCLSID(
__uuidof(CColorConvertDMO),
MFT_CATEGORY_VIDEO_PROCESSOR,
L"",
MFT_ENUM_FLAG_SYNCMFT,
0,
NULL,
0,
NULL),
"Error registering colour converter DSP.");
// Create a separate Window and thread to host the Video player.
CreateThread(NULL, 0, (LPTHREAD_START_ROUTINE)InitializeWindow, NULL, 0, NULL);
Sleep(1000);
if (_hwnd == nullptr)
{
printf("Failed to initialise video window.\n");
goto done;
}
// ----- Set up Video sink (Enhanced Video Renderer). -----
CHECK_HR(MFCreateVideoRendererActivate(_hwnd, &pActive),
"Failed to created video rendered activation context.");
CHECK_HR(pActive->ActivateObject(IID_IMFMediaSink, (void**)&pVideoSink),
"Failed to activate IMFMediaSink interface on video sink.");
// Initialize the renderer before doing anything else including querying for other interfaces,
// see https://msdn.microsoft.com/en-us/library/windows/desktop/ms704667(v=vs.85).aspx.
CHECK_HR(pVideoSink->QueryInterface(__uuidof(IMFVideoRenderer), (void**)&pVideoRenderer),
"Failed to get video Renderer interface from EVR media sink.");
CHECK_HR(pVideoRenderer->InitializeRenderer(NULL, NULL),
"Failed to initialise the video renderer.");
CHECK_HR(pVideoSink->QueryInterface(__uuidof(IMFGetService), (void**)&pService),
"Failed to get service interface from EVR media sink.");
CHECK_HR(pService->GetService(MR_VIDEO_RENDER_SERVICE, __uuidof(IMFVideoDisplayControl), (void**)&pVideoDisplayControl),
"Failed to get video display control interface from service interface.");
CHECK_HR(pVideoDisplayControl->SetVideoWindow(_hwnd),
"Failed to SetVideoWindow.");
CHECK_HR(pVideoDisplayControl->SetVideoPosition(NULL, &rc),
"Failed to SetVideoPosition.");
CHECK_HR(pVideoSink->GetStreamSinkByIndex(0, &pStreamSink),
"Failed to get video renderer stream by index.");
CHECK_HR(pStreamSink->GetMediaTypeHandler(&pSinkMediaTypeHandler),
"Failed to get media type handler for stream sink.");
DWORD sinkMediaTypeCount = 0;
CHECK_HR(pSinkMediaTypeHandler->GetMediaTypeCount(&sinkMediaTypeCount),
"Failed to get sink media type count.");
std::cout << "Sink media type count: " << sinkMediaTypeCount << "." << std::endl;
// ----- Set up Video source (is either a file or webcam capture device). -----
#if USE_WEBCAM_SOURCE
CHECK_HR(GetVideoSourceFromDevice(WEBCAM_DEVICE_INDEX, &pVideoSource, &pVideoReader),
"Failed to get webcam video source.");
#else
CHECK_HR(GetVideoSourceFromFile(MEDIA_FILE_PATH, &pVideoSource, &pVideoReader),
"Failed to get file video source.");
#endif
CHECK_HR(pVideoReader->GetCurrentMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, &videoSourceOutputType),
"Error retrieving current media type from first video stream.");
CHECK_HR(pVideoReader->SetStreamSelection((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, TRUE),
"Failed to set the first video stream on the source reader.");
CHECK_HR(pVideoSource->CreatePresentationDescriptor(&pSourcePresentationDescriptor),
"Failed to create the presentation descriptor from the media source.");
CHECK_HR(pSourcePresentationDescriptor->GetStreamDescriptorByIndex(0, &fSelected, &pSourceStreamDescriptor),
"Failed to get source stream descriptor from presentation descriptor.");
CHECK_HR(pSourceStreamDescriptor->GetMediaTypeHandler(&pSourceMediaTypeHandler),
"Failed to get source media type handler.");
DWORD srcMediaTypeCount = 0;
CHECK_HR(pSourceMediaTypeHandler->GetMediaTypeCount(&srcMediaTypeCount),
"Failed to get source media type count.");
std::cout << "Source media type count: " << srcMediaTypeCount << ", is first stream selected " << fSelected << "." << std::endl;
std::cout << "Default output media type for source reader:" << std::endl;
std::cout << GetMediaTypeDescription(videoSourceOutputType) << std::endl << std::endl;
// ----- Create a compatible media type and set on the source and sink. -----
// Set the video input type on the EVR sink.
CHECK_HR(MFCreateMediaType(&pVideoOutType), "Failed to create video output media type.");
CHECK_HR(pVideoOutType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video), "Failed to set video output media major type.");
CHECK_HR(pVideoOutType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32), "Failed to set video sub-type attribute on media type.");
CHECK_HR(pVideoOutType->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive), "Failed to set interlace mode attribute on media type.");
CHECK_HR(pVideoOutType->SetUINT32(MF_MT_ALL_SAMPLES_INDEPENDENT, TRUE), "Failed to set independent samples attribute on media type.");
CHECK_HR(MFSetAttributeRatio(pVideoOutType, MF_MT_PIXEL_ASPECT_RATIO, 1, 1), "Failed to set pixel aspect ratio attribute on media type.");
CHECK_HR(CopyAttribute(videoSourceOutputType, pVideoOutType, MF_MT_FRAME_SIZE), "Failed to copy video frame size attribute to media type.");
CHECK_HR(CopyAttribute(videoSourceOutputType, pVideoOutType, MF_MT_FRAME_RATE), "Failed to copy video frame rate attribute to media type.");
//CHECK_HR(GetSupportedMediaType(pMediaTypeHandler, &pVideoOutType),
// "Failed to get supported media type.");
std::cout << "Custom media type defined as:" << std::endl;
std::cout << GetMediaTypeDescription(pVideoOutType) << std::endl << std::endl;
auto doesSinkSupport = pSinkMediaTypeHandler->IsMediaTypeSupported(pVideoOutType, &pHintMediaType);
if (doesSinkSupport != S_OK) {
std::cout << "Sink does not support desired media type." << std::endl;
goto done;
}
else {
CHECK_HR(pSinkMediaTypeHandler->SetCurrentMediaType(pVideoOutType),
"Failed to set input media type on EVR sink.");
}
// The block below always failed during testing. My guess is the source media type handler
// is not aligned with the video reader somehow.
/*auto doesSrcSupport = pSourceMediaTypeHandler->IsMediaTypeSupported(pVideoOutType, &pHintMediaType);
if (doesSrcSupport != S_OK) {
std::cout << "Source does not support desired media type." << std::endl;
goto done;
}
else {
CHECK_HR(pSourceMediaTypeHandler->SetCurrentMediaType(pVideoOutType),
"Failed to set output media type on source reader.");
}*/
CHECK_HR(pVideoReader->SetCurrentMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, pVideoOutType),
"Failed to set output media type on source reader.");
// ----- Source and sink now configured. Set up remaining infrastructure and then start sampling. -----
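For context, here is a minimal sketch of the sampling loop that follows this configuration (an illustration, not the full program): read an uncompressed frame from the source reader, copy its bytes into the Direct3D sample obtained from the EVR's IMFVideoSampleAllocator, stamp a time, and hand it to the stream sink. pVideoReader, pD3DVideoSample and pStreamSink are the variables declared above; the other local names are illustrative and error handling is omitted.
// Hedged sketch: read a frame, copy it into the EVR-allocated Direct3D sample,
// stamp a time and deliver it to the stream sink. Assumes the presentation clock
// has already been set on the sink and started.
LONGLONG llBaseTime = -1;
while (true)
{
    IMFSample* pSample = NULL;
    DWORD streamIndex = 0, flags = 0;
    LONGLONG llTimeStamp = 0;

    // Synchronous read of the next uncompressed video frame.
    pVideoReader->ReadSample(MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0,
        &streamIndex, &flags, &llTimeStamp, &pSample);

    if (flags & MF_SOURCE_READERF_ENDOFSTREAM) break;
    if (pSample == NULL) continue;                  // e.g. a stream tick, no frame data

    if (llBaseTime < 0) llBaseTime = llTimeStamp;   // rebase timestamps to zero

    // Copy the frame bytes into the Direct3D sample supplied by the EVR allocator.
    // Stride differences between the source frame and the D3D surface are ignored for brevity.
    IMFMediaBuffer* pSrcBuffer = NULL, * pDstBuffer = NULL;
    BYTE* pSrc = NULL, * pDst = NULL;
    DWORD srcLen = 0, dstMax = 0;
    pSample->ConvertToContiguousBuffer(&pSrcBuffer);
    pD3DVideoSample->GetBufferByIndex(0, &pDstBuffer);
    pSrcBuffer->Lock(&pSrc, NULL, &srcLen);
    pDstBuffer->Lock(&pDst, &dstMax, NULL);
    memcpy(pDst, pSrc, (srcLen < dstMax) ? srcLen : dstMax);
    pDstBuffer->Unlock();
    pSrcBuffer->Unlock();
    pDstBuffer->SetCurrentLength(srcLen);

    // Timestamp relative to the first frame, then deliver to the EVR.
    pD3DVideoSample->SetSampleTime(llTimeStamp - llBaseTime);
    pStreamSink->ProcessSample(pD3DVideoSample);

    pDstBuffer->Release();
    pSrcBuffer->Release();
    pSample->Release();
}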
It turned out I needed to manually wire up the colour conversion MFT (I'm fairly sure some Media Foundation scenarios wire it up automatically, but perhaps only when using a topology) and adjust the clock settings on the Direct3D IMFSample supplied to the EVR.
Working example.
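For illustration, a minimal sketch of what "manually wiring up the colour conversion MFT" can look like (an assumption about the shape of the solution, not the working example itself): create the colour converter DSP directly as an IMFTransform, give it an RGB24 input type and an RGB32 output type, and push each webcam sample through it before copying the result to the EVR. pWebcamSample and pRgb32Sample are hypothetical names; VIDEO_WIDTH/VIDEO_HEIGHT come from the code above and error handling is omitted.
// Hedged sketch: drive the colour converter DSP (CColorConvertDMO) directly as an MFT.
// Requires wmcodecdsp.h for CLSID_CColorConvertDMO and linking wmcodecdspuuid.lib.
IMFTransform* pConverter = NULL;
CoCreateInstance(CLSID_CColorConvertDMO, NULL, CLSCTX_INPROC_SERVER,
    IID_PPV_ARGS(&pConverter));

// Input: what the webcam delivers (RGB24). Output: what the EVR sink was given (RGB32).
IMFMediaType* pInType = NULL, * pOutType = NULL;
MFCreateMediaType(&pInType);
pInType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
pInType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB24);
MFSetAttributeSize(pInType, MF_MT_FRAME_SIZE, VIDEO_WIDTH, VIDEO_HEIGHT);
pConverter->SetInputType(0, pInType, 0);

MFCreateMediaType(&pOutType);
pOutType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
pOutType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32);
MFSetAttributeSize(pOutType, MF_MT_FRAME_SIZE, VIDEO_WIDTH, VIDEO_HEIGHT);
pConverter->SetOutputType(0, pOutType, 0);

// Per frame: feed the RGB24 sample in, pull the converted RGB32 sample out.
// This MFT does not allocate output samples, so the caller supplies one
// (pRgb32Sample: a caller-allocated sample with a large-enough media buffer).
pConverter->ProcessInput(0, pWebcamSample, 0);

MFT_OUTPUT_DATA_BUFFER outBuffer = { 0 };
outBuffer.dwStreamID = 0;
outBuffer.pSample = pRgb32Sample;
DWORD status = 0;
pConverter->ProcessOutput(0, 1, &outBuffer, &status);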
The Source Reader is normally able to do this conversion itself: RGB24 -> RGB32.
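For example (a sketch against the code in the question, with pReaderAttrs/pReader as illustrative names): enabling the reader's built-in video processing when it is created from the media source lets you request RGB32 directly, even though the webcam only offers RGB24/I420.
// Hedged sketch: let the Source Reader insert the colour converter itself.
IMFAttributes* pReaderAttrs = NULL;
MFCreateAttributes(&pReaderAttrs, 1);
// Allows format conversions inside the reader; the older
// MF_SOURCE_READER_ENABLE_VIDEO_PROCESSING flag also covers RGB24 -> RGB32.
pReaderAttrs->SetUINT32(MF_SOURCE_READER_ENABLE_ADVANCED_VIDEO_PROCESSING, TRUE);

IMFSourceReader* pReader = NULL;
MFCreateSourceReaderFromMediaSource(pVideoSource, pReaderAttrs, &pReader);

// An RGB32 output type can now be set on the reader.
// pVideoOutType is the RGB32 media type built earlier in the question's code.
pReader->SetCurrentMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, pVideoOutType);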
"as far as I can tell the EVR only supports RGB32"
Not necessarily; it depends on your video processor: mofo7777 / Stackoverflow
Under the MFVideoEVR project, replace all MFVideoFormat_RGB32 with MFVideoFormat_NV12 and it should work on an NVidia GPU card. Also change Sleep(20); to Sleep(40); in Main.cpp (HRESULT DisplayVideo(...)), because the NV12 format is better optimised (that value suits a 25 fps video frame rate).
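In terms of the media-type setup shown in the question, that amounts to changing only the subtype line (a sketch; whether the EVR accepts it depends on the GPU's video processor):
// Hedged sketch: offer NV12 to the EVR instead of RGB32 (video-processor dependent).
CHECK_HR(pVideoOutType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_NV12),
    "Failed to set video sub-type attribute on media type.");
// An I420 webcam then only needs a cheap chroma repack to NV12 rather than an RGB conversion.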
About your question:
It can be done without handling the colour conversion MFT. Starting from MFVideoEVR, there are two things to update (see the sketch after this list for the second point):
- set up a video capture source instead of a video file source
- handle the sample time manually, because the capture sample times are not accurate
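A minimal sketch of that manual sample timing (an assumption about the approach, not the MFVideoCaptureEVR code itself): ignore the capture device's timestamps and stamp each Direct3D sample from a monotonically increasing counter driven by the nominal frame rate. The 30 fps figure is illustrative.
// Hedged sketch: generate sample times ourselves instead of trusting the capture timestamps.
const LONGLONG llFrameDuration = 10 * 1000 * 1000 / 30;   // 100 ns units, assuming ~30 fps
LONGLONG llNextSampleTime = 0;

// ... inside the frame loop, after the frame bytes have been copied into pD3DVideoSample:
pD3DVideoSample->SetSampleTime(llNextSampleTime);
pD3DVideoSample->SetSampleDuration(llFrameDuration);
llNextSampleTime += llFrameDuration;

pStreamSink->ProcessSample(pD3DVideoSample);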
The source code is here: mofo7777 / Stackoverflow, under the MFVideoCaptureEVR project.