Problem on OpenGL context sharing and pixel formats

alexdlabs
Joined: 8 Mar 2011 - 09:12
Problem on OpenGL context sharing and pixel formats

JUCE OpenGL context sharing can fail when a non-default pixel format is requested.

In my case, on Win7 64-bit SP1 with an Nvidia GTX 295 (WHQL driver 301.42): if I share a context created with a default OpenGLPixelFormat with a second context whose OpenGLPixelFormat::depthBufferBits is set to 32, context creation succeeds (pixel format 7 is chosen for the first context and 97 for the second), but the wglShareLists call fails.

The problem is caused by the missing WGL_ACCELERATION_ARB attribute in the wglChoosePixelFormatARB call: the first context ends up created by the ICD OpenGL driver, the second by an MCD OpenGL driver.

On Windows, context sharing is only allowed between contexts that use the same OpenGL implementation (cf. the MSDN wglShareLists documentation).

A quick way to solve this is to enforce use of the fully accelerated (ICD) OpenGL driver, and to make wglChoosePixelFormatExtension fail if the requested pixel format is not available in full acceleration mode.

This can be done by adding the following lines in the juce::OpenGLContext::NativeContext::wglChoosePixelFormatExtension() method:

atts[n++] = WGL_ACCELERATION_ARB;
atts[n++] = WGL_FULL_ACCELERATION_ARB;

Another option is to add a bool to the OpenGLPixelFormat class, letting the caller choose the acceleration mode. But this second approach is probably not fully cross-platform.

Thanks for your concern about this issue.

alexdlabs
Re: Problem on OpenGL context sharing and pixel formats

Just to complete this post about juce::OpenGLPixelFormat:

When ChoosePixelFormat or wglChoosePixelFormatARB succeeds, the returned pixel format can differ from the input parameters: ask for a 16-bit depth buffer and you may get a pixel format with a 24-bit depth buffer; request a non-power-of-two multisampling value and the next greater power of two is returned...

This probably depends on the driver/hardware/OS implementation, and it is probably also true on GLX and other platforms.

It would be great to have a new method like

OpenGLPixelFormat OpenGLContext::getPixelFormat() const

returning an OpenGLPixelFormat updated by a wglGetPixelFormatAttribivARB call after context creation.

I think this would give better insight into actual resource consumption, so it's a must-have! Do you agree?
Thanks again for reading this.

jules
Joined: 29 Apr 2013 - 18:37
Re: Problem on OpenGL context sharing and pixel formats

Interesting, thanks! Adding those attributes sounds like a good idea, I'll do that now.

alexdlabs
Re: Problem on OpenGL context sharing and pixel formats

Thanks a lot for your quick reaction on the context acceleration/sharing problem on Windows.

About my second post: I've just discovered that, according to the WGL_ARB_pixel_format specification, the values set for the attributes WGL_COLOR_BITS_ARB, WGL_RED_BITS_ARB, WGL_GREEN_BITS_ARB, WGL_BLUE_BITS_ARB, WGL_ALPHA_BITS_ARB, WGL_DEPTH_BITS_ARB, WGL_STENCIL_BITS_ARB, etc. are minimum values!

This explains why the implementation is free to return greater values when strictly equal ones are not available. The Linux glXChooseFBConfig documentation says the same thing, and so does eglChooseConfig.

So calling glXGetFBConfigAttrib on Linux or wglGetPixelFormatAttribivARB on Windows after context creation would allow JUCE to return the actually obtained OpenGLPixelFormat to the juce::OpenGLContext caller.
I think it's a good feature to have, if I'm not the only one interested, of course.

PS: I don't know AGL, but NSOpenGLPixelFormat has an interesting NSOpenGLPFAMinimumPolicy attribute. I hope all this info helps in your endless great work on JUCE.