
Saturday, November 21, 2009

OpenGL GL_SELECT problems with ATI, NVidia

  • Explanations of how to install and uninstall the ATI drivers. See here
Selecting and picking objects

The 1st explanation: see here. Below is a copy of the explanation.
Beware that a BIG drop in performance exists on ATI and NVIDIA GPUs.

Hardware acceleration for GL_SELECT runs at full speed only on high-end graphics cards built specifically for graphics workstations.

The bad thing is that it's only a matter of drivers (some folks have hacked the drivers), but that's not a good solution.

So if you have lots of objects to pick (like I have, > 10000), it's better not to rely on that method for object picking.
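
For reference, below is a minimal sketch of the GL_SELECT picking path these quotes are talking about. It assumes a current OpenGL context; pickAt(), drawScene(), and PICK_BUFFER_SIZE are placeholder names, not from the original posts, and drawScene() is expected to wrap each pickable object in glPushName()/glPopName().

#include <GL/gl.h>
#include <GL/glu.h>

#define PICK_BUFFER_SIZE 512

/* Returns the name of the frontmost object under window coords (x, y),
   or -1 if nothing was hit. */
int pickAt(int x, int y, void (*drawScene)(void))
{
    GLuint buffer[PICK_BUFFER_SIZE];
    GLint viewport[4];

    glSelectBuffer(PICK_BUFFER_SIZE, buffer);   /* must precede GL_SELECT */
    glGetIntegerv(GL_VIEWPORT, viewport);

    glRenderMode(GL_SELECT);        /* the call whose performance dropped */
    glInitNames();

    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    /* Restrict rendering to a 5x5 pixel region around the cursor. */
    gluPickMatrix((GLdouble)x, (GLdouble)(viewport[3] - y),
                  5.0, 5.0, viewport);
    /* Re-apply the scene's normal projection here, e.g. gluPerspective. */

    glMatrixMode(GL_MODELVIEW);
    drawScene();                    /* geometry only; nothing is drawn */

    glMatrixMode(GL_PROJECTION);
    glPopMatrix();
    glMatrixMode(GL_MODELVIEW);

    GLint hits = glRenderMode(GL_RENDER);   /* back to normal rendering */

    /* Each hit record is: name count, min depth, max depth, names... ;
       keep the nearest hit. */
    int nearestName = -1;
    GLuint nearestDepth = 0xffffffffu;
    GLuint *p = buffer;
    for (GLint i = 0; i < hits; ++i) {
        GLuint nameCount = *p++;
        GLuint zMin = *p++;
        p++;                        /* skip max depth */
        if (nameCount > 0 && zMin < nearestDepth) {
            nearestDepth = zMin;
            nearestName = (int)*p;
        }
        p += nameCount;             /* skip remaining names on the stack */
    }
    return nearestName;
}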
The 2nd explanation: see here. Below is a copied version.
As mentioned above, this is very easy to implement. But there's a big drawback, namely the maximum size of the Name Stack. Every OpenGL implementation must offer a Name Stack size of at least 64 names (my GeForce4 Ti4400 with NVidia's Detonator 28.80 offers a Name Stack depth of 128 names). This is not enough, especially for complex scenes. If you need to retrieve information on more objects than your Name Stack is deep, then you'll have to use another method, such as casting a ray from where your mouse cursor is.
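
The ray-casting alternative mentioned in that quote can be sketched with gluUnProject, which has no name-stack limit; pickRay() is a hypothetical helper name, and the actual intersection test against the scene is left to the application.

#include <GL/gl.h>
#include <GL/glu.h>

/* Builds a world-space pick ray from window coordinates (x, y). */
void pickRay(int x, int y, GLdouble origin[3], GLdouble dir[3])
{
    GLdouble model[16], proj[16];
    GLint viewport[4];
    GLdouble nearPt[3], farPt[3];

    glGetDoublev(GL_MODELVIEW_MATRIX, model);
    glGetDoublev(GL_PROJECTION_MATRIX, proj);
    glGetIntegerv(GL_VIEWPORT, viewport);

    /* Flip y: OpenGL's window origin is bottom-left. */
    GLdouble winY = (GLdouble)(viewport[3] - y);

    /* Unproject the cursor at the near (z=0) and far (z=1) planes. */
    gluUnProject((GLdouble)x, winY, 0.0, model, proj, viewport,
                 &nearPt[0], &nearPt[1], &nearPt[2]);
    gluUnProject((GLdouble)x, winY, 1.0, model, proj, viewport,
                 &farPt[0],  &farPt[1],  &farPt[2]);

    for (int i = 0; i < 3; ++i) {
        origin[i] = nearPt[i];
        dir[i]    = farPt[i] - nearPt[i];   /* not normalized */
    }
    /* Intersect this ray with your own object data (e.g. bounding
       spheres) and pick the closest hit. */
}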
The 3rd explanation: see here. Below is a copy.

I've started noticing a dramatic drop in the performance of OpenGL rendering in SELECT mode, that is, after a call to glRenderMode(GL_SELECT). GL_SELECT is meant to be faster than regular rendering, because it only needs to render the geometry, not shading or textures, but recent drivers (since July 2007) have started slowing this operation down dramatically.

So far, it only appears to affect ATI graphics drivers. I haven't tested any HD cards -- if you have one, please consider contributing some numbers, using the test program below. Also, the numbers below indicate that GL_RENDER performance may very well be affected too -- by half! Clearly something is screwed up.

....
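
(The author's actual test program is elided above. Purely as a hedged illustration of the kind of measurement involved, one might time the same scene under GL_RENDER and GL_SELECT as below; timeRenderModes() and drawScene() are made-up names, and clock() only approximates GPU time even with glFinish().)

#include <stdio.h>
#include <time.h>
#include <GL/gl.h>

#define FRAMES 100

void timeRenderModes(void (*drawScene)(void))
{
    static GLuint selectBuf[64];    /* may overflow on big scenes */
    clock_t t0, t1;

    /* Plain GL_RENDER pass. */
    t0 = clock();
    for (int i = 0; i < FRAMES; ++i) {
        drawScene();
        glFinish();                 /* force the GPU to finish the frame */
    }
    t1 = clock();
    printf("GL_RENDER: %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

    /* Same geometry under GL_SELECT. */
    glSelectBuffer(64, selectBuf);
    t0 = clock();
    for (int i = 0; i < FRAMES; ++i) {
        glRenderMode(GL_SELECT);
        drawScene();
        glRenderMode(GL_RENDER);    /* retrieves hit records; synchronizes */
    }
    t1 = clock();
    printf("GL_SELECT: %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
}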

Nvidia drivers seem to be unaffected -- if they were doing it too, I might be led to believe a:

Conspiracy Theory

There's a chance that ATI/AMD (and Nvidia) are deliberately disabling this feature of their graphics cards. Games often get by without using it, while high-end graphics workstation applications like Maya depend on it heavily. There's a chance the card manufacturers want you to shell out for a workstation card, like a Quadro, if you want to use this feature. But clearly the cards can support it just fine!

The 4th explanation: see here.

Those explanations make it clear enough that we can stop testing on Francois's computer.
