Media Foundation Exposure in milliseconds


Before I ask my question I need to state that I am primarily a Linux C++ developer and don't have that much experience with writing Windows applications. I compile all code using g++ and NOT a Microsoft compiler.

Background

For some time, I have been working on a scientific imaging application which supports various specialist cameras. Most of these cameras have manufacturer-supplied SDKs which make development much easier. However, some of the cameras that my application supports are standard UVC cameras. Until recently, the application was Linux-only, but it is now being ported to macOS and Windows.

  1. The Linux port implements UVC support via Video4Linux and works exceptionally well.
  2. The macOS port implements UVC support via libuvc (https://github.com/libuvc/libuvc) and also works very well.

The Problem: Windows UVC Support

I'm having problems implementing UVC camera support on Windows. I have written both DirectShow and Media Foundation backends which work well enough and can control parameters such as gain, brightness and contrast just as well as V4L2, which is exactly what I need. However, the problems arise when I try to implement an exposure control.
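
For reference, the ProcAmp-style parameters are straightforward. A simplified sketch of how I set brightness via IAMVideoProcAmp (from strmif.h), assuming m_imf_media_source is the already-created media source and new_value is the value to apply:

IAMVideoProcAmp *proc_amp = NULL;

HRESULT hr = m_imf_media_source->QueryInterface(IID_IAMVideoProcAmp, (void**) &proc_amp);
if (SUCCEEDED(hr))
{
    // Switch the control to manual and apply the new value.
    hr = proc_amp->Set(VideoProcAmp_Brightness, new_value, VideoProcAmp_Flags_Manual);
    proc_amp->Release();
}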

If I take a specific UVC camera which has a documented exposure range of 1 ms to 1000 ms, then I can use the full exposure range with Video4Linux by using V4L2_CID_EXPOSURE_ABSOLUTE. When a new exposure is applied, I see a nice linear change in the output image. I can observe the same when using libuvc on macOS.
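
For comparison, the Linux path boils down to something like the following (a simplified sketch: fd is the open V4L2 device, and I assume auto-exposure has already been switched to manual):

#include <linux/videodev2.h>
#include <sys/ioctl.h>

// V4L2_CID_EXPOSURE_ABSOLUTE is expressed in 100 µs units, so a
// 1 ms to 1000 ms range maps to 10 through 10000 - thousands of
// addressable steps.
bool set_exposure_ms(int fd, int exposure_ms)
{
    v4l2_control ctrl = {};
    ctrl.id = V4L2_CID_EXPOSURE_ABSOLUTE;
    ctrl.value = exposure_ms * 10;
    return ioctl(fd, VIDIOC_S_CTRL, &ctrl) == 0;
}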

When using the same camera on Windows with either DirectShow or Media Foundation, the exposure range is very different. I use the following code to query the range of the exposure control:

HRESULT hr;
IAMCameraControl *cam_ctrl = NULL;

long lmin, lmax, lstep, ldft, lcap;

// The Media Foundation media source exposes the DirectShow camera
// control interface directly via QueryInterface.
hr = m_imf_media_source->QueryInterface(IID_IAMCameraControl, (void**) &cam_ctrl);
if (FAILED(hr))
{
    throw_error(hr);
}

hr = cam_ctrl->GetRange(CameraControl_Exposure, &lmin, &lmax, &lstep, &ldft, &lcap);
if (FAILED(hr))
{
    cam_ctrl->Release();
    if (hr == E_PROP_ID_UNSUPPORTED)
    {
        // The camera does not expose an exposure control at all.
        *min = 0;
        *max = 0;
        *value = 0;
        return false;
    }
    else
    {
        throw_error(hr);
    }
}
cam_ctrl->Release();

*min = static_cast<int>(lmin);
*max = static_cast<int>(lmax);
*value = static_cast<int>(ldft);

return true;

The values returned by cam_ctrl->GetRange are, as the documentation states, not in real-time units. For this camera they are -11 through to -3, and for some UVC cameras which support exposures greater than 1000 ms the values are positive. When I apply the exposures to the stream, the steps are very coarse. As already mentioned, the exposure steps on this specific camera under Linux are linear and there are hundreds of them; in Windows there are only around 14.
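
These numbers make sense once you know that the documentation defines CameraControl_Exposure as log base 2 of the exposure time in seconds, so each unit step doubles or halves the exposure time. Expressed as code (my own helpers, not part of any API):

#include <cmath>

// CameraControl_Exposure is documented as log base 2 seconds: a value
// of n selects an exposure of 2^n seconds, so -11 is roughly 0.49 ms
// and -3 is 125 ms.
long exposure_ms_to_log2(double ms)
{
    return std::lround(std::log2(ms / 1000.0));
}

double log2_to_exposure_ms(long value)
{
    return std::pow(2.0, static_cast<double>(value)) * 1000.0;
}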

My primary question is this: is it possible to drive UVC cameras with real-time exposure units in either DirectShow or Media Foundation and, if so, how is this done? I'm happy to consider another approach if anybody has any ideas. Sadly, such coarse exposure steps are not acceptable.

I know that libuvc is supported on Windows, but it requires a specific driver to be installed, which is something I don't really want to ask of my users.
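
One avenue I have found but not yet verified is the extended camera controls introduced in Windows 8.1, which are supposed to accept the exposure time in 100 ns units through IKsControl on the media source. Below is an untested sketch of what I understand the call to look like; KSPROPERTY_CAMERACONTROL_EXTENDED_EXPOSUREMODE and the KSCAMERA_EXTENDEDPROP_* structures come from ksmedia.h, and I don't know yet whether the MinGW headers provide them or whether every UVC driver honours them:

#include <mfidl.h>
#include <ks.h>
#include <ksproxy.h>
#include <ksmedia.h>

HRESULT set_exposure_100ns(IMFMediaSource *source, ULONGLONG exposure_100ns)
{
    IKsControl *ks_control = NULL;
    HRESULT hr = source->QueryInterface(IID_IKsControl, (void**) &ks_control);
    if (FAILED(hr))
    {
        return hr;
    }

    // The payload is an extended-property header followed by a value.
    struct {
        KSCAMERA_EXTENDEDPROP_HEADER header;
        KSCAMERA_EXTENDEDPROP_VALUE value;
    } payload = {};

    payload.header.Version = KSCAMERA_EXTENDEDPROP_VERSION;
    payload.header.PinId = KSCAMERA_EXTENDEDPROP_FILTERSCOPE;
    payload.header.Size = sizeof(payload);
    payload.header.Flags = KSCAMERA_EXTENDEDPROP_EXPOSUREMODE_MANUAL;
    payload.value.Value.ull = exposure_100ns;   // exposure time, 100 ns units

    KSPROPERTY prop = {};
    prop.Set = KSPROPERTYSETID_ExtendedCameraControl;
    prop.Id = KSPROPERTY_CAMERACONTROL_EXTENDED_EXPOSUREMODE;
    prop.Flags = KSPROPERTY_TYPE_SET;

    ULONG bytes_returned = 0;
    hr = ks_control->KsProperty(&prop, sizeof(prop),
                                &payload, sizeof(payload), &bytes_returned);
    ks_control->Release();
    return hr;
}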

All help is greatly appreciated.

Amanda.
